https://blog.alexalemi.com/kl-is-all-you-need.html "KL is All You Need", Alexander A. Alemi, 2024-01-08: "…the core of essentially all modern machine learning methods is a single universal objective: Kullback-Leibler (KL) divergence minimization…. Understand KL, understand the…"
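As a quick refresher on the quantity the post is built around, here is a minimal sketch (my own, not from the post) of discrete KL divergence:

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(p || q) = sum_i p_i * log(p_i / q_i).

    Measures the information lost when q is used to approximate p;
    it is always >= 0 and equals 0 iff p == q.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, p))  # 0.0 for identical distributions
print(kl_divergence(p, q))  # small positive value: q approximates p imperfectly
```

Minimizing this quantity over a family of model distributions q is the "single universal objective" the post refers to.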
What I Read: binary cross-entropy, log loss
https://towardsdatascience.com/understanding-binary-cross-entropy-log-loss-a-visual-explanation-a3ac6025181a?gi=375ce73be21b "Understanding binary cross-entropy / log loss: a visual explanation", Daniel Godoy, Nov 21, 2018: "If you are training a binary classifier, chances are you are using binary cross-entropy / log loss…"
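For reference, a minimal sketch of the loss the article visualizes (my own code, not from the article):

```python
import math

def binary_cross_entropy(y_true, y_prob, eps=1e-12):
    """Mean log loss: -(1/N) * sum[y*log(p) + (1-y)*log(1-p)].

    Probabilities are clipped away from 0 and 1 to avoid log(0).
    """
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)
        total += y * math.log(p) + (1 - y) * math.log(1 - p)
    return -total / len(y_true)

# Confident, correct predictions give a low loss...
print(binary_cross_entropy([1, 0], [0.9, 0.1]))  # ~0.105
# ...while confident, wrong ones are penalized heavily.
print(binary_cross_entropy([1, 0], [0.1, 0.9]))  # ~2.303
```

Note the tie-in to the entry above: minimizing this loss is exactly minimizing the KL divergence between the empirical label distribution and the model's predicted distribution (up to a constant entropy term).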