http://jonathanstray.com/to-apply-ai-for-good-think-form-extraction
"To apply AI for good, think form extraction," Jonathan Stray, October 24, 2020. "Folks who want to use AI/ML for good generally think of things like building…"
What I Read: This AI learns by reading the web
https://www.technologyreview.com/2020/09/04/1008156/knowledge-graph-ai-reads-web-machine-learning-natural-language-processing/
"This know-it-all AI learns by reading the entire web nonstop," Will Douglas Heaven, MIT Technology Review. Diffbot is building the biggest-ever knowledge graph by applying image recognition and natural-language processing to billions of web pages.
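The article describes Diffbot's product as a web-scale knowledge graph. At its core, a knowledge graph is a collection of (subject, predicate, object) facts; the toy sketch below shows only that basic data structure, with made-up entities and relations rather than Diffbot's actual schema.

```python
# Minimal sketch of a knowledge graph as (subject, predicate, object) triples.
# Entities and relations here are illustrative, not Diffbot's real data.
from collections import defaultdict

class KnowledgeGraph:
    def __init__(self):
        # index triples by subject for simple lookups
        self.triples = defaultdict(list)

    def add(self, subject, predicate, obj):
        self.triples[subject].append((predicate, obj))

    def query(self, subject):
        return self.triples.get(subject, [])

kg = KnowledgeGraph()
kg.add("Diffbot", "builds", "knowledge graph")
kg.add("Diffbot", "reads", "billions of web pages")
print(kg.query("Diffbot"))
# [('builds', 'knowledge graph'), ('reads', 'billions of web pages')]
```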
What I Read: Transformer Architecture
https://blog.exxactcorp.com/a-deep-dive-into-the-transformer-architecture-the-development-of-transformer-models/
"A Deep Dive Into the Transformer Architecture – The Development of Transformer Models," Exxact Deep Learning blog, July 14, 2020. "There's no better time…"
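For reference, the operation every Transformer layer is built around is scaled dot-product attention. Here is a minimal NumPy sketch of that single step (my own toy example, not code from the article):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention, the core operation of a Transformer layer.

    Q, K, V: arrays of shape (seq_len, d_k) for queries, keys, and values.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # similarity of every token with every other
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over keys
    return weights @ V                                 # weighted sum of values

# toy example: 4 tokens, 8-dimensional embeddings; Q = K = V gives self-attention
x = np.random.randn(4, 8)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```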
What I Read: Progress of Natural Language Processing
https://blog.exxactcorp.com/the-unreasonable-progress-of-deep-neural-networks-in-natural-language-processing-nlp/
"The Unreasonable Progress of Deep Neural Networks in Natural Language Processing (NLP)," Exxact Deep Learning blog, June 2, 2020. "With the advent of pre-trained generalized language models, we…"
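The excerpt refers to pre-trained generalized language models. As a rough illustration of what "pre-trained" buys you in practice, here is a few-line example using the Hugging Face transformers library (my choice of library for the sketch; the article may discuss others):

```python
# Using a pre-trained language model for a downstream task with zero training code.
# Requires: pip install transformers
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a pre-trained model on first use
print(classifier("Transfer learning has made NLP remarkably accessible."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```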
What I Read: Reformer, the Efficient Transformer
https://towardsdatascience.com/illustrating-the-reformer-393575ac6ba0?gi=34b920510f6f
"Illustrating the Reformer: The efficient Transformer," Alireza Dirafzoon, Towards Data Science, Feb 4. "Recently, Google introduced the Reformer architecture, a Transformer model designed to efficiently handle processing very long sequences of data (e.g. up to…"
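Much of the Reformer's efficiency comes from locality-sensitive hashing (LSH) attention: similar queries and keys are hashed into the same bucket, and attention is computed only within each bucket instead of over the whole sequence. The sketch below is a simplified toy version of that bucketing idea, not the actual Reformer implementation:

```python
import numpy as np

def lsh_bucketed_attention(x, n_buckets=4, seed=0):
    """Toy illustration of LSH-style bucketed attention: hash vectors into
    buckets with a random projection, then attend only within each bucket.
    A conceptual sketch, not the Reformer's real algorithm."""
    rng = np.random.default_rng(seed)
    seq_len, d = x.shape
    # random-projection hashing: assign each token to its strongest direction
    projections = rng.standard_normal((d, n_buckets))
    buckets = np.argmax(x @ projections, axis=-1)

    out = np.zeros_like(x)
    for b in range(n_buckets):
        idx = np.where(buckets == b)[0]
        if idx.size == 0:
            continue
        chunk = x[idx]                                 # tokens that hashed to this bucket
        scores = chunk @ chunk.T / np.sqrt(d)          # attention restricted to the bucket
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        out[idx] = weights @ chunk                     # values == inputs in this toy example
    return out

x = np.random.randn(16, 8)                             # 16 tokens, 8-dim embeddings
print(lsh_bucketed_attention(x).shape)                 # (16, 8)
```

Because each token only attends within its bucket, the cost scales with bucket size rather than with the square of the full sequence length, which is what lets the Reformer handle very long inputs.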