What I Read: KL is All You Need
https://blog.alexalemi.com/kl-is-all-you-need.html
KL is All You Need, by Alexander A. Alemi, 2024-01-08.
“…the core of essentially all modern machine learning methods is a single universal objective: Kullback-Leibler (KL) divergence minimization…. Understand KL, understand the…”
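The identity behind that claim is that for a fixed data distribution p, cross-entropy decomposes as H(p, q) = H(p) + KL(p‖q), so minimizing cross-entropy over the model q is exactly KL minimization. A minimal NumPy sketch of that decomposition (the toy distributions are my own, not from the post):

```python
import numpy as np

# Discrete "data" distribution p and model distribution q over the same support.
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])

# KL(p || q) = sum_i p_i * log(p_i / q_i)
kl = np.sum(p * np.log(p / q))

# H(p, q) = H(p) + KL(p || q); H(p) does not depend on q, so minimizing
# cross-entropy in q (i.e., maximum likelihood) is KL minimization.
entropy = -np.sum(p * np.log(p))
cross_entropy = -np.sum(p * np.log(q))
assert np.isclose(cross_entropy, entropy + kl)
print(f"KL(p || q) = {kl:.4f}")
```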
What I Read: Adversarial Attacks on LLMs
https://lilianweng.github.io/posts/2023-10-25-adv-attack-llm/
Adversarial Attacks on LLMs, by Lilian Weng, October 25, 2023.
“Adversarial attacks are inputs that trigger the model to output something undesired.”
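Weng's post surveys concrete attack methods; as a toy illustration of the black-box flavor only, here is a random-search sketch that mutates an appended suffix to maximize a scoring function. `score_fn` is a hypothetical stand-in for “how strongly the target model produces the undesired output”, not an API from the post:

```python
import random

def suffix_attack(prompt, score_fn, vocab, n_steps=200, suffix_len=5):
    """Toy black-box attack: random search over an appended suffix.

    score_fn(text) is a hypothetical scorer (higher = output more undesired).
    """
    suffix = [random.choice(vocab) for _ in range(suffix_len)]
    best = score_fn(prompt + " " + " ".join(suffix))
    for _ in range(n_steps):
        candidate = list(suffix)
        candidate[random.randrange(suffix_len)] = random.choice(vocab)  # mutate one token
        score = score_fn(prompt + " " + " ".join(candidate))
        if score > best:  # keep mutations that push the model further off course
            suffix, best = candidate, score
    return " ".join(suffix), best

# Demo with a dummy scorer (counts a "trigger" word) just to show the loop runs.
vocab = ["alpha", "beta", "gamma", "delta"]
adv_suffix, score = suffix_attack("Tell me about", lambda t: t.count("gamma"), vocab)
```

The serious attacks the post covers replace this random search with smarter proposals (e.g., gradient-guided token swaps), but the structure, searching over inputs to maximize an undesired behavior, is the same.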
What I Read: binary cross-entropy, log loss
https://towardsdatascience.com/understanding-binary-cross-entropy-log-loss-a-visual-explanation-a3ac6025181a?gi=375ce73be21b
Understanding binary cross-entropy / log loss: a visual explanation, by Daniel Godoy, Nov 21, 2018.
“If you are training a binary classifier, chances are you are using binary cross-entropy / log loss…”
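The quantity the article visualizes is the average of −log of the probability the model assigned to the true class. A minimal NumPy sketch (the labels and probabilities here are made up for illustration):

```python
import numpy as np

def binary_cross_entropy(y_true, y_prob, eps=1e-12):
    """Mean log loss: -mean(y*log(p) + (1-y)*log(1-p))."""
    y_prob = np.clip(y_prob, eps, 1 - eps)  # guard against log(0)
    return -np.mean(y_true * np.log(y_prob) + (1 - y_true) * np.log(1 - y_prob))

y_true = np.array([1, 0, 1, 1])          # ground-truth labels
y_prob = np.array([0.9, 0.2, 0.6, 0.8])  # predicted P(y = 1)
print(f"log loss = {binary_cross_entropy(y_true, y_prob):.4f}")
```

Confidently wrong predictions dominate the average, since −log(p) blows up as p → 0 for the true class; that asymmetry is what the article's visualizations emphasize.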