TrAVis: Visualise BERT attention in your browser
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
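A minimal sketch of the Transformers `pipeline` API, the library's one-line entry point; no model is specified here, so the library's default pretrained checkpoint is downloaded on first use:

```python
# Sentiment analysis with the default pretrained checkpoint.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Attention visualisers make debugging BERT much easier."))
# -> [{'label': 'POSITIVE', 'score': ...}]
```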
60+ Implementations/tutorials of deep learning papers with side-by-side notes; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans (cyclegan, stylegan2, ...), reinforcement learning (ppo, dqn), capsnet, distillation, ...
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
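A minimal sketch of the vit-pytorch API; the hyperparameters below are illustrative settings, not recommendations:

```python
import torch
from vit_pytorch import ViT

# A small ViT: 256x256 input split into 32x32 patches, 1000 classes.
v = ViT(
    image_size=256, patch_size=32, num_classes=1000,
    dim=1024, depth=6, heads=16, mlp_dim=2048,
)

img = torch.randn(1, 3, 256, 256)   # one fake RGB image
preds = v(img)                      # (1, 1000) class logits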
AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search, or conversational agent chatbots.
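A minimal sketch of the component/pipeline idea in Haystack 2.x: write a few documents to an in-memory store, wire a BM25 retriever into a pipeline, and query it. Module paths follow the 2.x layout; treat the exact names as version-dependent.

```python
from haystack import Document, Pipeline
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever

# Index two toy documents in memory.
store = InMemoryDocumentStore()
store.write_documents([
    Document(content="RWKV is an RNN with transformer-level performance."),
    Document(content="ViT classifies images with a single transformer encoder."),
])

# Wire a retriever component into a pipeline and run a query against it.
pipe = Pipeline()
pipe.add_component("retriever", InMemoryBM25Retriever(document_store=store))

result = pipe.run({"retriever": {"query": "Which model is an RNN?"}})
print(result["retriever"]["documents"][0].content)
```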
Machine learning, in numpy
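Not numpy-ml's own API, just a flavour of "ML in plain numpy": ordinary least squares solved with the normal equations on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

# w = (X^T X)^{-1} X^T y, computed via a linear solve for stability.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(np.round(w, 2))   # close to [ 2.  -1.   0.5]
```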
Deep Learning Tutorial by Hung-yi Lee (《李宏毅深度学习教程》; recommended by Professor Hung-yi Lee and nicknamed the "Apple Book"). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
Natural Language Processing Tutorial for Deep Learning Researchers
RWKV (pronounced RwaKuv) is an RNN with great LLM performance that can also be trained directly like a GPT transformer (parallelizable). The current generation is RWKV-7 "Goose". It combines the best of RNNs and transformers: great performance, linear time, constant space (no KV cache), fast training, infinite ctx_len, and free sentence embedding.
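A toy numpy sketch of the constant-space recurrence idea behind RWKV, loosely modeled on the RWKV-4 "WKV" operator (RWKV-7's update is more elaborate); the shapes and decay parameterization here are illustrative assumptions, not the library's API:

```python
import numpy as np

def wkv_recurrence(k, v, w, u):
    """Mix a token sequence one step at a time with O(1) state.

    k, v : (T, D) per-token key/value projections
    w    : (D,) per-channel decay rate (w > 0)
    u    : (D,) bonus weight for the current token
    Returns (T, D) outputs; state is two D-vectors regardless of T.
    """
    T, D = k.shape
    num = np.zeros(D)            # running sum of exp(k) * v
    den = np.zeros(D)            # running sum of exp(k)
    out = np.empty((T, D))
    for t in range(T):
        ek = np.exp(k[t])
        # The current token contributes with an extra bonus exp(u).
        out[t] = (num + np.exp(u) * ek * v[t]) / (den + np.exp(u) * ek)
        # Decay the old state, then fold in the current token.
        decay = np.exp(-w)
        num = decay * num + ek * v[t]
        den = decay * den + ek
    return out

rng = np.random.default_rng(0)
y = wkv_recurrence(rng.normal(size=(8, 4)), rng.normal(size=(8, 4)),
                   np.full(4, 0.5), np.zeros(4))
print(y.shape)   # (8, 4): linear in T, constant memory
```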
Easy-to-use and powerful NLP and LLM library with an awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications, including Text Classification, Neural Search, Question Answering, Information Extraction, Document Intelligence, Sentiment Analysis, etc.
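A minimal sketch of PaddleNLP's `Taskflow` interface, its documented one-line entry point for prebuilt pipelines; the task name and example text are illustrative choices:

```python
from paddlenlp import Taskflow

# Downloads a pretrained sentiment model on first use.
senta = Taskflow("sentiment_analysis")
print(senta("这个产品用起来真的很流畅，我非常喜欢"))
# -> [{'text': ..., 'label': 'positive', 'score': ...}]
```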
Scalable embedding, reasoning, ranking for images and sentences with CLIP
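A minimal sketch using the `clip-client` package that fronts clip-as-service; the server address is an assumption (whatever host/port you started `clip_server` on), and the image URL is hypothetical:

```python
from clip_client import Client

c = Client('grpc://0.0.0.0:51000')        # assumes a running clip_server
vecs = c.encode([
    'a photo of a dog',                   # text is embedded directly
    'https://example.com/cat.jpg',        # hypothetical image URI
])
print(vecs.shape)                         # (2, embedding_dim)
```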