https://kidger.site/thoughts/torch2jax/ Learning JAX as a PyTorch developer, by Patrick Kidger, posted on November 9, 2023. “Assuming you already know PyTorch, this is what I’ve found that you need to know to get up…”
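The biggest shift the article's title hints at is JAX's functional style: instead of calling `.backward()` on a loss tensor as in PyTorch, you transform a pure function with `jax.grad` (and optionally compile it with `jax.jit`). A minimal sketch of that difference (the function and data here are illustrative, not from the article):

```python
import jax
import jax.numpy as jnp

# In JAX, gradients come from transforming a pure function,
# not from calling loss.backward() on a tensor as in PyTorch.
def loss(w, x, y):
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# grad differentiates w.r.t. the first argument (w);
# jit compiles the resulting gradient function with XLA.
grad_fn = jax.jit(jax.grad(loss))

w = jnp.ones(3)
x = jnp.arange(6.0).reshape(2, 3)
y = jnp.array([1.0, 2.0])
g = grad_fn(w, x, y)  # gradient has the same shape as w
```

Note that `w` is passed in explicitly each call: JAX functions hold no mutable parameter state, which is exactly the mental-model change a PyTorch developer has to make.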
What I Read: Nvidia, GPU gold rush
https://blog.johnluttig.com/p/nvidia-envy-understanding-the-gpu Nvidia Envy: understanding the GPU gold rush, by John Luttig, Nov 10, 2023. “In 2023, thousands of companies and countries begged Nvidia to purchase more GPUs. Can the exponential demand endure?”
What I Read: Unify Batch and ML Systems
https://www.kdnuggets.com/2023/09/hopsworks-unify-batch-ml-systems-feature-training-inference-pipelines Unify Batch and ML Systems with Feature/Training/Inference Pipelines, by Jim Dowling, Co-Founder & CEO, Hopsworks, September 27, 2023. “This article introduces a unified architectural pattern for building both Batch and Real-Time…”
What I Read: Distributed Training, Finetuning
https://sumanthrh.com/post/distributed-and-efficient-finetuning/ Everything about Distributed Training and Efficient Finetuning, by Sumanth R Hegde, last updated on Oct 13, 2023. “practical guidelines and gotchas with multi-GPU and multi-node training”