Linear solvers in JAX and Equinox. https://docs.kidger.site/lineax
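A minimal usage sketch, following the pattern shown in the Lineax documentation (treat the exact names as assumptions if your installed version differs):

```python
import jax.random as jr
import lineax as lx

# Build a random (possibly non-square) linear system Ax = b.
matrix_key, vector_key = jr.split(jr.PRNGKey(0))
matrix = jr.normal(matrix_key, (10, 8))
vector = jr.normal(vector_key, (10,))

# Wrap the matrix as a linear operator and solve with QR,
# which also handles least-squares problems.
operator = lx.MatrixLinearOperator(matrix)
solution = lx.linear_solve(operator, vector, solver=lx.QR())
x = solution.value  # the computed solution vector
```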
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
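A minimal sketch of the high-level `pipeline` API (the default model is downloaded on first use):

```python
from transformers import pipeline

# Load a default sentiment-analysis model and run it on one sentence.
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers makes state-of-the-art NLP easy to use."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```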
Deep Learning for humans
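A minimal sketch of the Keras define/compile/fit workflow on random data (layer sizes and data here are purely illustrative):

```python
import numpy as np
import keras
from keras import layers

# Tiny MLP on random data, just to show the define/compile/fit flow.
x = np.random.rand(256, 8).astype("float32")
y = np.random.randint(0, 2, size=(256, 1))

model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=2, batch_size=32, verbose=0)
```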
100 Days of ML Coding
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
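A minimal sketch showing how the transformations named above compose:

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # Simple scalar function of the parameters w for a single example x.
    return jnp.sum((x @ w) ** 2)

grad_loss = jax.grad(loss)                              # differentiate
batched_grad = jax.vmap(grad_loss, in_axes=(None, 0))   # vectorize over a batch of x
fast_batched_grad = jax.jit(batched_grad)               # JIT-compile via XLA (CPU/GPU/TPU)

w = jnp.ones((3,))
xs = jnp.arange(12.0).reshape(4, 3)
print(fast_batched_grad(w, xs).shape)  # (4, 3): one gradient per batch element
```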
🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch and FLAX.
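A minimal text-to-image sketch; the checkpoint id and the CUDA device are assumptions for illustration, and any compatible Hub checkpoint and device can be substituted:

```python
import torch
from diffusers import DiffusionPipeline

# Load a pretrained text-to-image pipeline from the Hugging Face Hub.
# The model id below is one public example checkpoint.
pipe = DiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes a CUDA GPU is available

image = pipe("a watercolor painting of a lighthouse at dawn").images[0]
image.save("lighthouse.png")
```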
Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
🏆 A ranked list of awesome machine learning Python libraries. Updated weekly.
A collection of algorithms and data structures
RWKV (pronounced RwaKuv) is an RNN with great LLM performance that can also be trained directly like a GPT transformer (i.e., training is parallelizable). The current version is RWKV-7 "Goose". It combines the best of RNNs and transformers: great performance, linear time, constant space (no KV cache), fast training, unbounded context length, and free sentence embeddings.
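The "constant space (no KV cache)" point is the architectural core: a recurrent model carries a fixed-size state forward instead of attending over an ever-growing cache of past keys and values. The sketch below is a deliberately simplified linear-attention-style recurrence for illustration only; it is not the actual RWKV-7 update rule:

```python
import numpy as np

d = 8  # head dimension (illustrative)

def step(state, k, v, q, decay):
    """One token of a simplified linear-attention recurrence.

    `state` is a fixed d x d matrix, so memory does not grow with sequence
    length, unlike a transformer KV cache. NOT the real RWKV-7 update rule.
    """
    state = decay * state + np.outer(v, k)  # fold the new key/value into the state
    out = state @ q                         # read out with the current query
    return state, out

rng = np.random.default_rng(0)
state = np.zeros((d, d))
for _ in range(1000):                       # 1000 tokens, still only one d x d state
    k, v, q = rng.standard_normal((3, d))
    state, out = step(state, k, v, q, decay=0.97)
print(state.shape, out.shape)  # (8, 8) (8,)
```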
MIT Deep Learning Book in PDF format (complete book and individual parts) by Ian Goodfellow, Yoshua Bengio, and Aaron Courville