What I Read: Sampling for Text Generation
https://huyenchip.com//2024/01/16/sampling.html
Sampling for Text Generation, by Chip Huyen, 1/15/24
“ML models are probabilistic…. This probabilistic nature makes AI great for creative tasks…. However, this probabilistic nature also causes inconsistency and hallucinations. It’s fatal…”
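The inconsistency Huyen describes comes from drawing the next token from a probability distribution rather than always picking the most likely one. A minimal sketch of temperature-scaled sampling (the logits and temperature values are illustrative, not from the article):

```python
import numpy as np

def sample_token(logits, temperature=1.0, rng=None):
    """Sample a token index from temperature-scaled logits."""
    if rng is None:
        rng = np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    # Softmax with max subtraction for numerical stability.
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Three-token vocabulary with made-up logits.
logits = [2.0, 1.0, 0.1]
# Low temperature -> near-greedy; high temperature -> more random.
print(sample_token(logits, temperature=0.1))
```

Repeated calls at high temperature return different tokens for the same logits, which is exactly the probabilistic behavior the post discusses.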
What I Read: Confidence intervals, balanced accuracy
https://code.groundlight.ai/python-sdk/blog/confidence-intervals-for-balanced-accuracy
Tales from the Binomial Tail: Confidence intervals for balanced accuracy, by Ted Sandler (Senior Applied Scientist at Groundlight) and Leo Dirac (CTO and Co-founder at Groundlight), January 16, 2024
“…we put careful thought into measuring the…”
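Balanced accuracy is the average of per-class recalls, and each recall is a binomial proportion, which is where the "binomial tail" comes in. A rough sketch (the counts are invented, and this shows only per-class Clopper-Pearson intervals plus the balanced-accuracy point estimate, not the authors' method for a combined interval):

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact binomial (Clopper-Pearson) interval for k successes in n trials."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# Hypothetical per-class recalls: 45/50 positives and 80/100 negatives correct.
recalls = [(45, 50), (80, 100)]
balanced_acc = sum(k / n for k, n in recalls) / len(recalls)
print(balanced_acc)  # balanced accuracy, approximately 0.85
print([clopper_pearson(k, n) for k, n in recalls])
```

Combining the per-class intervals into one interval for the averaged metric is the harder problem the article actually tackles.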
What I Read: Few-Shot Learning
https://saturncloud.io/blog/breaking-the-data-barrier-how-zero-shot-one-shot-and-few-shot-learning-are-transforming-machine-learning/
Breaking the Data Barrier: How Zero-Shot, One-Shot, and Few-Shot Learning are Transforming Machine Learning, by Christophe Atten, May 17, 2023
“The improvements made over the last almost two decades have…”
What I Read: Multi-label NLP
https://www.kdnuggets.com/2023/03/multilabel-nlp-analysis-class-imbalance-loss-function-approaches.html
Multi-label NLP: An Analysis of Class Imbalance and Loss Function Approaches, by Oleksii Babych (Machine Learning Engineer at Provectus), March 17, 2023
“Multi-label NLP refers to the task of assigning multiple labels…”
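One common remedy for class imbalance in multi-label setups is to up-weight the positive term of the per-label binary cross-entropy. The sketch below is a generic illustration with invented labels and weights, not necessarily the approach the article recommends:

```python
import numpy as np

def weighted_bce(y_true, y_pred, pos_weight, eps=1e-7):
    """Binary cross-entropy with a per-label positive-class weight,
    so rare labels contribute more to the loss."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    loss = -(pos_weight * y_true * np.log(y_pred)
             + (1 - y_true) * np.log(1 - y_pred))
    return loss.mean()

# Two samples, three labels; the rarer labels get larger pos_weight.
y_true = np.array([[1, 0, 0], [0, 1, 0]])
y_pred = np.array([[0.8, 0.1, 0.2], [0.3, 0.7, 0.1]])
pos_weight = np.array([1.0, 2.0, 5.0])
print(weighted_bce(y_true, y_pred, pos_weight))
```

With all weights equal to 1 this reduces to ordinary multi-label binary cross-entropy, one loss per label averaged together.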
What I Read: binary cross-entropy, log loss
https://towardsdatascience.com/understanding-binary-cross-entropy-log-loss-a-visual-explanation-a3ac6025181a?gi=375ce73be21b
Understanding binary cross-entropy / log loss: a visual explanation, by Daniel Godoy, Nov 21, 2018
“If you are training a binary classifier, chances are you are using binary cross-entropy / log loss…”
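The log loss the article visualizes fits in a few lines: the negative mean of y*log(p) + (1-y)*log(1-p) over the predictions. A minimal sketch with made-up predictions:

```python
import math

def log_loss(y_true, y_pred, eps=1e-15):
    """Binary cross-entropy: -mean(y*log(p) + (1-y)*log(1-p))."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        # Clip probabilities away from 0 and 1 to avoid log(0).
        p = min(max(p, eps), 1 - eps)
        total += y * math.log(p) + (1 - y) * math.log(1 - p)
    return -total / len(y_true)

# Confident correct predictions give a small loss.
print(log_loss([1, 0, 1], [0.9, 0.1, 0.8]))
```

Confident wrong predictions are punished hard, which is the behavior the article's visual explanation highlights.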