https://www.quantamagazine.org/ai-overcomes-stumbling-block-on-brain-inspired-hardware-20220217/ “AI Overcomes Stumbling Block on Brain-Inspired Hardware” by Allison Whitten, Contributing Writer, February 17, 2022: “Algorithms that use the brain’s communication signal can now work on analog neuromorphic chips, which closely mimic our…”
What I Read: Bootstrapping Labels
https://thegradient.pub/bootstrapping-labels-via-_-supervision-human-in-the-loop/ “Bootstrapping Labels via _ Supervision & Human-In-The-Loop” by Eugene Yan, March 5, 2022: “Collecting training labels is a seldom discussed art…. In this write-up, we’ll discuss semi, active, and weakly supervised learning, and see…”
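As a concrete (hypothetical) illustration of the weak-supervision idea the post covers: a few hand-written labeling functions vote on unlabeled examples, and the combined votes become noisy training labels. The label names, rules, and texts below are made-up examples, not from the article.

SPAM, HAM, ABSTAIN = 1, 0, -1

def lf_contains_link(text):
    # Rule of thumb: messages with links tend to be spam.
    return SPAM if "http://" in text or "https://" in text else ABSTAIN

def lf_short_message(text):
    # Rule of thumb: very short messages tend to be benign.
    return HAM if len(text.split()) < 5 else ABSTAIN

def majority_vote(text, lfs):
    # Combine the noisy labeling functions, ignoring abstentions.
    votes = [lf(text) for lf in lfs if lf(text) != ABSTAIN]
    return max(set(votes), key=votes.count) if votes else ABSTAIN

unlabeled = ["check out https://example.com now", "ok thanks"]
weak_labels = [majority_vote(t, [lf_contains_link, lf_short_message]) for t in unlabeled]
print(weak_labels)  # [1, 0] -- noisy labels to bootstrap a classifier

The same loop extends naturally to the other approaches the post discusses: the noisy labels can seed a model whose confident predictions label more data (semi-supervised), or whose uncertain cases get routed to a human (active learning, human-in-the-loop).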
What I Read: Will Transformers Take Over Artificial Intelligence?
https://www.quantamagazine.org/will-transformers-take-over-artificial-intelligence-20220310/ “Will Transformers Take Over Artificial Intelligence?” by Stephen Ornes, Contributing Writer, March 10, 2022: “A simple algorithm that revolutionized how neural networks approach language is now taking on vision as well. It may not stop there.”
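For reference, the “simple algorithm” at the heart of a transformer is scaled dot-product self-attention, softmax(QK^T / sqrt(d)) V. A minimal numpy sketch with toy shapes; the inputs here are random placeholders, not anything from the article.

import numpy as np

def attention(Q, K, V):
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted mix of values

rng = np.random.default_rng(0)
tokens, d_model = 4, 8           # 4 "tokens": words for language, patches for vision
X = rng.normal(size=(tokens, d_model))
print(attention(X, X, X).shape)  # (4, 8): one updated vector per token

The same operation applies whether the tokens are words or image patches, which is part of why the jump from language to vision is so natural.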
What I Read: Data Observability vs. Data Testing
https://towardsdatascience.com/data-observability-vs-data-testing-everything-you-need-to-know-6f3d7193b388?gi=6618bd7121fd “Data Observability vs. Data Testing: Everything You Need to Know” (subtitle: “You already test your data. Do you need data observability, too?”) by Lior Gavish, Feb 12: “In any data system, there are two…”
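A rough sketch of the data-testing side for contrast: explicit, hand-written assertions about what the data should look like, run before it ships. The table and column names below are hypothetical; observability, as the piece contrasts it, is more about continuously monitoring such properties (freshness, volume, nulls) in production than asserting them up front.

import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "amount": [19.99, 5.00, 42.50],
    "created_at": pd.to_datetime(["2022-02-10", "2022-02-11", "2022-02-12"]),
})

# Hand-written expectations, checked at pipeline time.
assert orders["order_id"].is_unique, "duplicate order IDs"
assert orders["amount"].notna().all(), "null amounts"
assert (orders["amount"] >= 0).all(), "negative amounts"
print("data tests passed")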
What I Read: Why Bigger Neural Networks Do Better
https://www.quantamagazine.org/computer-scientists-prove-why-bigger-neural-networks-do-better-20220210/ “Computer Scientists Prove Why Bigger Neural Networks Do Better” by Mordechai Rorvig, Staff Writer, February 10, 2022: “Two researchers show that for neural networks to be able to remember better, they need far…”
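A toy illustration of the parameter-counting baseline behind this kind of result: a model with n free parameters (here, a degree n-1 polynomial) can exactly memorize n data points; the article’s proof concerns why doing this well, in the excerpt’s sense of “remember better,” takes far more parameters than that baseline. The data below is random and purely illustrative.

import numpy as np

n = 5
x = np.linspace(0.0, 1.0, n)
y = np.random.default_rng(1).normal(size=n)   # n arbitrary targets to memorize

coeffs = np.polyfit(x, y, deg=n - 1)          # n coefficients = n free parameters
print(np.allclose(np.polyval(coeffs, x), y))  # True: all n points fit exactly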