https://arxiv.org/abs/2409.18842 "Classical Statistical (In-Sample) Intuitions Don't Generalize Well: A Note on Bias-Variance Tradeoffs, Overfitting and Moving from Fixed to Random Designs", Alicia Curth, 27 Sep 2024. "…we show that classical intuitions relating
What I Read: transfer learning
https://lunar-joke-35b.notion.site/Transfer-Learning-101-133ba4b6a3fa800e8cede11ee3f1c1cd "Transfer Learning 101", Himanshu Dubey, Nov 5, 2024. "Let's understand Transfer Learning in greater detail."
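The core idea behind transfer learning is reusing a network trained on one task as a frozen feature extractor and fitting only a small new head on the target task. A toy sketch of that pattern (the feature function, training loop, and data here are all illustrative, not from the article):

```python
# Toy transfer-learning sketch: keep a "pretrained" feature extractor
# frozen and train only a small linear head on the new task.

def pretrained_features(x):
    # Stands in for a frozen pretrained network: never updated below.
    return [x, x * x]

def train_head(data, lr=0.1, steps=200):
    # Fit a linear head w.features + b by plain per-sample gradient
    # descent on squared error; only w and b are learned.
    w, b = [0.0, 0.0], 0.0
    for _ in range(steps):
        for x, y in data:
            f = pretrained_features(x)
            pred = sum(wi * fi for wi, fi in zip(w, f)) + b
            err = pred - y
            w = [wi - lr * err * fi for wi, fi in zip(w, f)]
            b -= lr * err
    return w, b

# Target task: y = x^2 + 1. The frozen features already contain x^2,
# so the head only needs to learn w ~ [0, 1] and b ~ 1.
data = [(x / 10.0, (x / 10.0) ** 2 + 1.0) for x in range(-10, 11)]
w, b = train_head(data)
```

The payoff is the same as in the full-scale setting: because the reusable structure lives in the frozen features, the part that must be trained from scratch is tiny.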
What I Read: Model Merging
https://planetbanatt.net/articles/modelmerging.html "Model Merging and You", Eryk Banatt, August 2024. "Model Merging is a weird and experimental technique which lets you take two models and combine them together to get a new model."
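The simplest member of this family is linear weight interpolation: average the parameters of two checkpoints that share an architecture. A minimal sketch, assuming parameters stored as plain name-to-weights dicts (the `merge_models` helper is illustrative, not from the article):

```python
def merge_models(weights_a, weights_b, alpha=0.5):
    """Linearly interpolate two checkpoints with the same architecture.

    weights_a / weights_b map parameter names to lists of floats;
    alpha is the mixing coefficient (0.5 is a plain average).
    """
    assert weights_a.keys() == weights_b.keys(), "architectures must match"
    return {
        name: [alpha * a + (1 - alpha) * b
               for a, b in zip(weights_a[name], weights_b[name])]
        for name in weights_a
    }

ckpt_a = {"layer1.weight": [1.0, 2.0]}
ckpt_b = {"layer1.weight": [3.0, 4.0]}
merged = merge_models(ckpt_a, ckpt_b)  # averages each parameter
```

This is the element-wise averaging at the base of fancier schemes (SLERP, task arithmetic, etc.); the requirement that the two models share a parameterization is why merging is usually done between fine-tunes of the same base model.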
What I Read: optimizing softmax
https://maharshi.bearblog.dev/optimizing-softmax-cuda "Learning CUDA by optimizing softmax: A worklog", Maharshi Pandya, 04 Jan 2025. "Optimizing softmax, especially in the context of GPU programming with CUDA, presents many opportunities for learning."
What I Read: ScyllaDB
https://medium.com/@abdurohman/mind-blowing-postgresql-meets-scylladbs-lightning-speed-and-monstrous-scalability-7dcda1eb1cea "Mind-blowing: PostgreSQL Meets ScyllaDB's Lightning Speed and Monstrous Scalability", Abdurohman, December 24, 2024. "While PostgreSQL excels in many use cases, our experience shows that for write-heavy, high-scale operations, the distributed architecture