What I Read: A Surprisingly Effective Way to Estimate Token Importance in LLM Prompts
https://www.watchful.io/blog/a-surprisingly-effective-way-to-estimate-token-importance-in-llm-prompts A Surprisingly Effective Way to Estimate Token Importance in LLM Prompts, by Shayan Mohanty. “Prompting is our primary means of instructing these AIs, yet the process of crafting effective prompts has …”
What I Read: To Understand Transformers, Focus on Attention
https://drscotthawley.github.io/blog/posts/Transformers1-Attention.html To Understand Transformers, Focus on Attention, by Scott H. Hawley, August 21, 2023. “To Understand Transformers, Focus on Attention”
What I Read: Attention with Performers
https://ai.googleblog.com/2020/10/rethinking-attention-with-performers.html Rethinking Attention with Performers, Friday, October 23, 2020. Posted by Krzysztof Choromanski and Lucy Colwell, Research Scientists, Google Research. “To resolve these issues, we introduce the Performer, a Transformer architecture with …”