https://www.asapp.com/blog/reducing-the-high-cost-of-training-nlp-models-with-sru/
Reducing the High Cost of Training NLP Models With SRU++
By Tao Lei, PhD, Research Leader and Scientist at ASAPP
“The Transformer architecture was proposed to accelerate model training in NLP….”
What I Read: Difficulty of Graph Anonymisation
https://www.timlrx.com/blog/tracetogether-and-the-difficulty-of-graph-anonymisation
TraceTogether and the Difficulty of Graph Anonymisation
Timothy Lin (@timlrx), Sunday, February 7, 2021
“The word ‘anonymised data’ seems to convey a certain sense of certainty that user information cannot be back-derived.”
What I Read: Data Quality Management
https://medium.com/dataseries/inside-the-architecture-powering-data-quality-management-at-uber-543d5e00ad19
Inside the Architecture Powering Data Quality Management at Uber
Data Quality Monitor implements novel statistical methods for anomaly detection and quality management in large data infrastructures.
Jesus Rodriguez, Feb 10
“Data quality…”
What I Read: Neural Nets, How Brains Learn
https://www.quantamagazine.org/artificial-neural-nets-finally-yield-clues-to-how-brains-learn-20210218/
Artificial Neural Nets Finally Yield Clues to How Brains Learn
Anil Ananthaswamy, Contributing Writer, February 18, 2021
“The learning algorithm that enables the runaway success of deep neural networks doesn’t work in…”
What I Read: Continual Learning, Amnesia, Neural Networks
https://medium.com/dataseries/ibm-uses-continual-learning-to-avoid-the-amnesia-problem-in-neural-networks-ae8241e1f3a3
IBM Uses Continual Learning to Avoid The Amnesia Problem in Neural Networks
Using continual learning might avoid the famous catastrophic forgetting problem in neural networks.
Jesus Rodriguez, Jan 25
“Building neural networks…”