What I Read: An Understanding of AI's Limitations Is Starting to Sink In
https://www.economist.com/technology-quarterly/2020/06/11/an-understanding-of-ais-limitations-is-starting-to-sink-in The Economist, Technology Quarterly, Jun 11th 2020: "An understanding of AI's limitations is starting to sink in." After years of hype, many people feel AI has failed to deliver, says Tim Cross.
What I Read: This AI learns by reading the web
https://www.technologyreview.com/2020/09/04/1008156/knowledge-graph-ai-reads-web-machine-learning-natural-language-processing/ MIT Technology Review: "This know-it-all AI learns by reading the entire web nonstop." Diffbot is building the biggest-ever knowledge graph by applying image recognition and natural-language processing to billions of web pages. By Will Douglas Heaven.
What I Read: Best Practices for Building Machine Learning at Scale
https://medium.com/dataseries/linkedins-pro-ml-architecture-summarizes-best-practices-for-building-machine-learning-at-scale-77fcb6afc9ec "LinkedIn's Pro-ML Architecture Summarizes Best Practices for Building Machine Learning at Scale." The reference architecture powers mission-critical machine learning workflows within LinkedIn. By Jesus Rodriguez, Sep 17.
What I Read: Can Neural Networks Show Imagination?
https://medium.com/dataseries/can-neural-networks-show-imagination-deepmind-thinks-they-can-b3b874aece67 "Can Neural Networks Show Imagination? DeepMind Thinks They Can." DeepMind has done some of the relevant work in the area of simulating imagination in deep learning systems. By Jesus Rodriguez, Sep 14.
What I Read: Transformer Architecture
https://blog.exxactcorp.com/a-deep-dive-into-the-transformer-architecture-the-development-of-transformer-models/ "A Deep Dive Into the Transformer Architecture – The Development of Transformer Models." Exxact Deep Learning blog, July 14, 2020.