What I Read: Learning Neural Causal Models
https://arxiv.org/abs/1910.01075 Learning Neural Causal Models from Unknown Interventions, by Nan Rosemary Ke, Olexa Bilaniuk, Anirudh Goyal, Stefan Bauer, Hugo Larochelle, Chris Pal, Yoshua Bengio. "We present a new framework for meta-learning causal models…"
What I Read: Symbolic Mathematics, Neural Networks
https://www.quantamagazine.org/symbolic-mathematics-finally-yields-to-neural-networks-20200520/ Symbolic Mathematics Finally Yields to Neural Networks. After translating some of math's complicated equations, researchers have created an AI system that they hope will answer even bigger questions. By Stephen…
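The system the article describes treats symbolic math as sequence translation: expressions are serialized into token sequences a seq2seq model can read. A minimal sketch of that preprocessing step, turning an expression tree into prefix-notation tokens (the tuple-based tree format and token names here are my own illustration, not the researchers'):

```python
# Minimal sketch: flatten a nested (op, args...) tuple tree into a list of
# prefix-notation tokens, the kind of sequence a translation model consumes.
# Tree representation and token names are illustrative assumptions.

def to_prefix(expr):
    """Recursively emit the operator, then the tokens of each argument."""
    if isinstance(expr, tuple):
        op, *args = expr
        tokens = [op]
        for a in args:
            tokens.extend(to_prefix(a))
        return tokens
    return [str(expr)]

# 3*x + cos(x)  ->  ['+', '*', '3', 'x', 'cos', 'x']
tree = ("+", ("*", 3, "x"), ("cos", "x"))
print(to_prefix(tree))
```

Prefix notation is convenient here because it needs no parentheses, so the token sequence determines the tree unambiguously.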
What I Read: Deep Generative Models
https://medium.com/@jrodthoughts/microsoft-research-unveils-three-efforts-to-advance-deep-generative-models-b1d2fe3395e8 Microsoft Research Unveils Three Efforts to Advance Deep Generative Models. Optimus, FQ-GAN and Prevalent bring new ideas to apply generative models at large scale. By Jesus Rodriguez, Apr 27. "With the emergence of…"
What I Read: Tonks Multi-Task Model
https://medium.com/shoprunner/tonks-building-one-multi-task-model-to-rule-them-all-3e5d020f1f2b Tonks: Building One (Multi-Task) Model to Rule Them All! Co-written by Nicole Carlson and Michael Sugimura, Apr 28. "Tonks is a library that streamlines the training of multi-task PyTorch networks."
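The multi-task pattern Tonks streamlines is a shared encoder feeding several task-specific heads, so one forward pass serves all tasks. A plain-NumPy sketch of that architecture (this is my illustration of the general idea, not Tonks's actual API; Tonks builds PyTorch modules, and the names, shapes, and task labels below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

class MultiTaskNet:
    """One shared encoder, one output head per task."""

    def __init__(self, in_dim, hidden, task_dims):
        # shared encoder weights, reused by every task
        self.W_shared = rng.normal(size=(in_dim, hidden))
        # task-specific heads, e.g. {"color": 5, "season": 4}
        self.heads = {t: rng.normal(size=(hidden, d)) for t, d in task_dims.items()}

    def forward(self, x):
        h = np.tanh(x @ self.W_shared)                    # shared representation
        return {t: h @ W for t, W in self.heads.items()}  # per-task logits

net = MultiTaskNet(in_dim=8, hidden=16, task_dims={"color": 5, "season": 4})
out = net.forward(rng.normal(size=(3, 8)))
print({t: v.shape for t, v in out.items()})  # {'color': (3, 5), 'season': (3, 4)}
```

The payoff of the shared trunk is that the expensive representation is computed once per input while each task pays only for its own small head.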
What I Read: sub-linear deep learning algorithm
https://www.kdnuggets.com/2020/03/deep-learning-breakthrough-sub-linear-algorithm-no-gpu.html Deep Learning Breakthrough: a sub-linear deep learning algorithm that does not need a GPU? By Anshumali Shrivastava, Rice University. "Backpropagation is not an efficient algorithm…. The core idea behind the…"
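The core idea behind the algorithm (SLIDE) is to avoid computing every neuron in a wide layer: locality-sensitive hashing retrieves only the neurons whose weight vectors are likely to have a large dot product with the input, making each forward pass sub-linear in layer width. A sketch of that retrieval step using random-hyperplane SimHash (the single hash table, bucket scheme, and all sizes here are my simplifications, not the actual SLIDE implementation):

```python
import numpy as np

rng = np.random.default_rng(42)
d, n_neurons, n_bits = 32, 1000, 8

W = rng.normal(size=(n_neurons, d))    # layer weights, one row per neuron
planes = rng.normal(size=(n_bits, d))  # random hyperplanes for SimHash

def simhash(v):
    """Sign pattern of v against the hyperplanes, packed into a bucket id."""
    bits = (planes @ v > 0).astype(int)
    return int("".join(map(str, bits)), 2)

# Preprocess once: hash every neuron's weight vector into a bucket.
# Vectors with similar directions tend to land in the same bucket.
buckets = {}
for i in range(n_neurons):
    buckets.setdefault(simhash(W[i]), []).append(i)

def sparse_forward(x):
    """Compute activations only for neurons in the input's hash bucket."""
    active = buckets.get(simhash(x), [])
    return {i: float(W[i] @ x) for i in active}

x = rng.normal(size=d)
acts = sparse_forward(x)
print(f"computed {len(acts)} of {n_neurons} neurons")
```

Because the hash lookup and the per-bucket dot products touch only a small fraction of the layer, the cost per input grows with bucket size rather than with the full neuron count, which is what lets the approach compete without a GPU.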