https://drscotthawley.github.io/blog/posts/Transformers1-Attention.html "To Understand Transformers, Focus on Attention", Scott H. Hawley, August 21, 2023
What I Read: LLM-based Products
https://eugeneyan.com/writing/llm-patterns/ "Patterns for Building LLM-based Systems & Products", eugeneyan. "This post is about practical patterns for integrating large language models (LLMs) into systems and products."