https://transformer-circuits.pub/2022/toy_model/index.html Toy Models of Superposition. Nelson Elhage, Tristan Hume, Catherine Olsson, Nicholas Schiefer, Tom Henighan, Shauna Kravec, Zac Hatfield-Dodds, Robert Lasenby, Dawn Drain, Carol Chen, Roger Grosse, Sam McCandlish, Jared Kaplan, …
What I Read: passively learned, causality
What can be passively learned about causality? Simons Institute. Andrew Lampinen (Google DeepMind). Jun 25, 2024. “What could language models learn about causality and experimentation from their passive training?”
What I Read: Statistical Critiques That Don’t Quite Work
https://nickch-k.github.io/SomeThoughts/posts/2022-01-23-overdebunked/ Overdebunked! Six Statistical Critiques That Don’t Quite Work. When healthy skepticism of statistics turns into worse statistics (and an excuse). Nick Huntington-Klein. Jan. 23, 2022. “Skepticism about statistics is good. However, just …”