What I Read: sparsity, PyTorch, Hadamard product

https://alexshtf.github.io/2024/07/07/HadamardParameterization.html

Alex Shtoff
Fun with sparsity in PyTorch via Hadamard product parametrization
Jul 7, 2024


“The beauty of sparsity inducing regularization is that we let our optimizer discover the sparsity patterns, instead of doing extremely expensive neural architecture search. And the beauty of Hadamard-product parametrization is that it lets us re-use existing optimizers of our ML frameworks to add sparsity-inducing regularizers, without having to write specialized custom optimizers.”
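
The trick behind that quote, as I understand it: write each weight as an elementwise product w = u ⊙ v; then plain L2 weight decay on u and v behaves like L1 regularization on w (since the minimum of (u² + v²)/2 over all factorizations u·v = w is |w|), so an ordinary optimizer with weight_decay > 0 pushes the effective weights to exact zeros. Below is a minimal sketch of that idea using PyTorch's torch.nn.utils.parametrize; the module and names (Hadamard, u, v) are my own, and the post's actual implementation may differ.

```python
import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

class Hadamard(nn.Module):
    """Reparametrize a weight tensor as w = u * v (elementwise).

    The tensor originally stored on the layer plays the role of u;
    v is an extra learnable factor of the same shape.
    """
    def __init__(self, shape):
        super().__init__()
        # Start v at ones so the effective weight equals the original init.
        self.v = nn.Parameter(torch.ones(shape))

    def forward(self, u):
        return u * self.v  # effective weight the layer actually uses

layer = nn.Linear(64, 32)
parametrize.register_parametrization(layer, "weight", Hadamard(layer.weight.shape))

# Ordinary L2 weight decay on u and v acts like L1 on w = u * v,
# so an off-the-shelf optimizer is enough to induce sparsity.
opt = torch.optim.SGD(layer.parameters(), lr=1e-2, weight_decay=1e-3)

x = torch.randn(8, 64)
loss = layer(x).pow(2).mean()   # stand-in loss, just for illustration
loss.backward()
opt.step()
```

What I like about this framing is exactly what the quote says: the sparsity falls out of the parametrization plus ordinary weight decay, so neither the training loop nor the optimizer needs any custom code.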