https://structural-time-series.fastforwardlabs.com/
Structural Time Series, FF16 · October 2020
“We will describe a family of models—Generalized Additive Models, or GAMs—which have the advantage of being scalable and easy to interpret, and tend to…”
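The quote cuts off, but the core idea of a GAM-style structural time series is that the signal is modeled as an additive sum of interpretable components, such as a trend plus a seasonal cycle. A minimal sketch of that decomposition via ordinary least squares; the weekly period and component names are my own illustrative assumptions, not taken from the report:

```python
import numpy as np

# Toy structural time series: additive trend + weekly seasonality + noise,
# in the spirit of GAMs. All specifics here are illustrative assumptions.
rng = np.random.default_rng(0)
n = 200
t = np.arange(n)
season = 3.0 * np.sin(2 * np.pi * t / 7)            # weekly cycle
y = 0.5 * t + season + rng.normal(0, 0.5, n)        # trend + season + noise

# Design matrix: linear trend column plus one-hot day-of-week dummies.
# The dummies double as per-weekday intercepts, so no separate constant.
X = np.column_stack([t, np.eye(7)[t % 7]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Recover the additive components separately -- that separability is what
# makes this family of models easy to interpret.
trend_hat = coef[0] * t
season_hat = X[:, 1:] @ coef[1:]
resid = y - trend_hat - season_hat
```

Because each component is fit as its own additive term, you can plot `trend_hat` and `season_hat` individually and read the model directly, which is the interpretability the quote alludes to.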
What I Read: Frameworks Scaling Deep Learning Training
https://medium.com/dataseries/microsoft-and-google-open-sourced-these-frameworks-based-on-their-work-scaling-deep-learning-c0510e907038
Microsoft and Google Open Sourced These Frameworks Based on Their Work Scaling Deep Learning Training
Jesus Rodriguez, Oct 26
“Google and Microsoft have recently released new frameworks for distributed deep learning training.”
What I Read: Switchback Tests and Randomized Experimentation Under Network Effects
https://medium.com/@DoorDash/switchback-tests-and-randomized-experimentation-under-network-effects-at-doordash-f1d938ab7c2a
Switchback Tests and Randomized Experimentation Under Network Effects at DoorDash
DoorDash, Feb 14, 2018
David Kastelman, Data Scientist & Raghav Ramesh, Machine Learning Engineer
“…given the systemic nature of many of…”
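The point of the post is that network effects break user-level randomization, so the unit of randomization becomes a (region, time-window) pair that "switches back" between arms over time. A toy sketch of what deterministic window-level assignment could look like; the 30-minute window, region granularity, and hashing scheme are all my own illustrative assumptions, not DoorDash's actual implementation:

```python
import hashlib

# Hedged sketch of switchback assignment: randomize (region, time-window)
# units instead of users, so interference stays inside a unit.
# Window length and hashing details below are assumptions for illustration.

WINDOW_MIN = 30  # switch arms every 30 minutes (assumed)

def switchback_arm(region: str, epoch_min: int, salt: str = "exp1") -> str:
    """Deterministically assign a region/time-window unit to an arm."""
    window = epoch_min // WINDOW_MIN
    key = f"{salt}:{region}:{window}".encode()
    bucket = int.from_bytes(hashlib.sha256(key).digest()[:4], "big")
    return "treatment" if bucket % 2 == 0 else "control"
```

Every event in the same region during the same 30-minute window lands in the same arm, so effects that spill across the marketplace (e.g., courier supply) are experienced consistently within each unit; the experimenter then compares window-level outcomes across arms.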
What I Read: Attention with Performers
https://ai.googleblog.com/2020/10/rethinking-attention-with-performers.html
Rethinking Attention with Performers
Friday, October 23, 2020 · Posted by Krzysztof Choromanski and Lucy Colwell, Research Scientists, Google Research
“To resolve these issues, we introduce the Performer, a Transformer architecture with…”
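The excerpt is truncated, but the gist of the Performer is approximating softmax attention with a random feature map so that attention can be computed in time linear in sequence length, rather than quadratic. A toy sketch of that kernel trick using positive random features; this is my own simplified illustration of the idea, not the paper's exact FAVOR+ mechanism:

```python
import numpy as np

# Hedged sketch: approximate exp(q . k) with a positive random feature map
# phi, so that softmax attention (Q K^T then V) can be reassociated as
# phi(Q) (phi(K)^T V), which is linear in sequence length.
# Simplified for illustration; not the paper's exact construction.

rng = np.random.default_rng(1)

def phi(x, W):
    # Positive random features: exp(w.x - |x|^2 / 2) / sqrt(m), whose inner
    # product is an unbiased estimate of exp(x . y) for w ~ N(0, I).
    m = W.shape[0]
    proj = x @ W.T
    return np.exp(proj - (x ** 2).sum(-1, keepdims=True) / 2) / np.sqrt(m)

def linear_attention(Q, K, V, n_features=256):
    d = Q.shape[-1]
    W = rng.normal(size=(n_features, d))
    # Scaling by d**0.25 reproduces the usual 1/sqrt(d) inside the kernel.
    Qp, Kp = phi(Q / d ** 0.25, W), phi(K / d ** 0.25, W)
    KV = Kp.T @ V          # (m, d_v): aggregate keys/values once
    Z = Qp @ Kp.sum(0)     # per-query softmax normalizer estimate
    return (Qp @ KV) / Z[:, None]
```

The reassociation is the whole trick: computing `Kp.T @ V` first costs O(n) in sequence length, whereas materializing the full attention matrix costs O(n²).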