What I Read: Tabular Data with HuggingFace Transformers
https://medium.com/georgian-impact-blog/how-to-incorporate-tabular-data-with-huggingface-transformers-b70ac45fcfb4
"How to Incorporate Tabular Data with HuggingFace Transformers," Georgian, Oct 23: "At Georgian, we find ourselves working with supporting tabular feature information as well as unstructured text data. We found that…"
What I Read: Revisiting Sutton’s Bitter Lesson for AI
https://blog.exxactcorp.com/compute-goes-brrr-revisiting-suttons-bitter-lesson-artificial-intelligence/
"Compute Goes Brrr: Revisiting Sutton's Bitter Lesson for Artificial Intelligence," by Marketing, October 27, 2020: "The main driver of AI progress, according to Sutton, is the increasing availability of compute…"
What I Read: Attention with Performers
https://ai.googleblog.com/2020/10/rethinking-attention-with-performers.html
"Rethinking Attention with Performers," posted by Krzysztof Choromanski and Lucy Colwell, Research Scientists, Google Research, October 23, 2020: "To resolve these issues, we introduce the Performer, a Transformer architecture with…"
What I Read: Transformers for Image Recognition
https://medium.com/swlh/an-image-is-worth-16×16-words-transformers-for-image-recognition-at-scale-brief-review-of-the-8770a636c6a8
"An Image Is Worth 16×16 Words: Transformers for Image Recognition at Scale (Brief Review of the ICLR 2021 Paper)," Stan Kriventsov, Oct 9: "The reason attention models haven't been doing better…"
What I Read: Transformer Architecture
https://blog.exxactcorp.com/a-deep-dive-into-the-transformer-architecture-the-development-of-transformer-models/
"A Deep Dive Into the Transformer Architecture – The Development of Transformer Models," by Marketing, July 14, 2020: "There's no better time…"
What I Read: Progress of Natural Language Processing
https://blog.exxactcorp.com/the-unreasonable-progress-of-deep-neural-networks-in-natural-language-processing-nlp/
"The Unreasonable Progress of Deep Neural Networks in Natural Language Processing (NLP)," by Marketing, June 2, 2020: "With the advent of pre-trained generalized language models, we…"