https://ericmjl.github.io/essays-on-data-science/machine-learning/graph-nets/
An attempt at demystifying graph deep learning, by Eric Ma:
“There are a ton of great explainers of what graph neural networks are. However, I find that a lot of them …”
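The essay builds graph deep learning up from message passing over an adjacency matrix. As a rough companion (my sketch, not the essay's code), here is one round of neighbourhood aggregation in numpy; the toy graph, feature sizes, and ReLU choice are purely illustrative.

```python
import numpy as np

# Toy graph: 4 nodes, undirected edges (0-1, 1-2, 2-3).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

H = np.random.randn(4, 3)  # each node carries a 3-dimensional feature vector
W = np.random.randn(3, 3)  # "learnable" projection, random here for illustration

def message_pass(A, H, W):
    """One round of message passing: each node averages its neighbours'
    features (plus its own), projects them with W, and applies ReLU."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)  # node degrees
    H_agg = (A_hat @ H) / deg               # mean of neighbour features
    return np.maximum(0, H_agg @ W)         # linear projection + ReLU

H_new = message_pass(A, H, W)
print(H_new.shape)  # (4, 3): updated node embeddings
```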
What I Read: Attention with Performers
https://ai.googleblog.com/2020/10/rethinking-attention-with-performers.html
Rethinking Attention with Performers, by Krzysztof Choromanski and Lucy Colwell (Research Scientists, Google Research), October 23, 2020:
“To resolve these issues, we introduce the Performer, a Transformer architecture with …”
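The post's key idea (FAVOR+) is to replace the quadratic softmax attention matrix with random-feature maps, so attention can be computed in time linear in sequence length. Below is a rough numpy sketch of that principle using positive random features to approximate the softmax kernel; it illustrates the idea rather than reproducing the Performer implementation, and the feature count and scaling are my choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def positive_random_features(X, W):
    """Map rows of X with phi(x) = exp(W x - ||x||^2 / 2) / sqrt(m),
    whose dot products approximate the softmax kernel exp(q . k)."""
    m = W.shape[0]
    proj = X @ W.T                                       # (n, m)
    sq_norm = 0.5 * np.sum(X ** 2, axis=-1, keepdims=True)
    return np.exp(proj - sq_norm) / np.sqrt(m)

def linear_attention(Q, K, V, n_features=256):
    """Approximate softmax attention without forming the n x n matrix."""
    d = Q.shape[-1]
    W = rng.standard_normal((n_features, d))
    Qp = positive_random_features(Q / d ** 0.25, W)  # folds in the 1/sqrt(d) scaling
    Kp = positive_random_features(K / d ** 0.25, W)
    KV = Kp.T @ V                        # (m, d_v), linear in sequence length
    numerator = Qp @ KV                  # (n, d_v)
    denominator = Qp @ Kp.sum(axis=0)    # (n,) normalisation
    return numerator / denominator[:, None]

# Compare against exact softmax attention on a small example.
n, d = 64, 16
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
scores = np.exp(Q @ K.T / np.sqrt(d))
exact = (scores / scores.sum(axis=1, keepdims=True)) @ V
approx = linear_attention(Q, K, V)
print(np.abs(exact - approx).mean())  # approximation error; shrinks as n_features grows
```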
What I Read: Transformers for Image Recognition
https://medium.com/swlh/an-image-is-worth-16×16-words-transformers-for-image-recognition-at-scale-brief-review-of-the-8770a636c6a8
An Image Is Worth 16×16 Words: Transformers for Image Recognition at Scale (Brief Review of the ICLR 2021 Paper), by Stan Kriventsov, Oct 9:
“The reason attention models haven’t been doing better …”
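The title's premise is that an image can be fed to a standard Transformer by cutting it into 16×16 patches and treating each flattened patch as a "word". Here is a minimal sketch of just that patch-tokenisation step (my illustration; a real ViT then applies a learned linear projection and adds a class token and position embeddings).

```python
import numpy as np

def image_to_patch_tokens(image, patch_size=16):
    """Split an image (H, W, C) into non-overlapping patch_size x patch_size
    patches and flatten each patch, yielding a sequence of 'words' that a
    standard Transformer can consume."""
    H, W, C = image.shape
    assert H % patch_size == 0 and W % patch_size == 0
    n_h, n_w = H // patch_size, W // patch_size
    patches = image.reshape(n_h, patch_size, n_w, patch_size, C)
    patches = patches.transpose(0, 2, 1, 3, 4)           # (n_h, n_w, p, p, C)
    return patches.reshape(n_h * n_w, patch_size * patch_size * C)

tokens = image_to_patch_tokens(np.zeros((224, 224, 3)))
print(tokens.shape)  # (196, 768): 14 x 14 patches, each flattened to 768 values
```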
What I Read: Transformer Architecture
https://blog.exxactcorp.com/a-deep-dive-into-the-transformer-architecture-the-development-of-transformer-models/
A Deep Dive Into the Transformer Architecture – The Development of Transformer Models, Exxact Deep Learning blog, July 14, 2020:
“There’s no better time …”
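As a reference point for the article's walkthrough, this is a minimal numpy version of the scaled dot-product attention at the Transformer's core: a single head, no masking, and no learned projections, so it is a sketch of the operation rather than a full layer.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """The core Transformer operation: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                         # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)          # row-wise softmax
    return weights @ V                                      # weighted sum of values

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((5, 8)) for _ in range(3))   # 5 tokens, d_k = 8
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (5, 8)
```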
What I Read: Progress of Natural Language Processing
https://blog.exxactcorp.com/the-unreasonable-progress-of-deep-neural-networks-in-natural-language-processing-nlp/
The Unreasonable Progress of Deep Neural Networks in Natural Language Processing (NLP), Exxact Deep Learning blog, June 2, 2020:
“With the advent of pre-trained generalized language models, we …”
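The article's through-line is that pre-trained, general-purpose language models let practitioners get strong results with little task-specific work. Purely as an illustration (the library choice is mine, not the article's), a pre-trained model can be used in a few lines via Hugging Face's transformers pipeline.

```python
# Illustration only: using a pre-trained model via the `transformers`
# library (pip install transformers); downloads a default model on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # no task-specific training needed
print(classifier("Transfer learning has made NLP remarkably accessible."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```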