MoH: Multi-Head Attention as Mixture-of-Head Attention
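The repo above is about replacing standard multi-head attention with a mixture-of-heads: a router scores the heads per token and only the top-k heads contribute, weighted by softmax-normalized router scores. A minimal NumPy sketch of that idea, with all names (`moh_attention`, `Wr`, shapes) my own assumptions rather than the repo's API, and with the per-head outputs summed instead of concatenated for brevity:

```python
import numpy as np

def moh_attention(x, Wq, Wk, Wv, Wr, k=2):
    """Mixture-of-head attention sketch (assumed simplification, not the repo's code).

    x  : (T, d)   token embeddings
    Wq, Wk, Wv : (H, d, dh) per-head projections
    Wr : (d, H)   router that scores each head per token
    """
    H, d, dh = Wq.shape
    T = x.shape[0]

    # Router: keep only the top-k heads per token, softmax over those.
    scores = x @ Wr                                   # (T, H)
    topk = np.argsort(scores, axis=1)[:, -k:]          # indices of selected heads
    mask = np.zeros_like(scores, dtype=bool)
    np.put_along_axis(mask, topk, True, axis=1)
    gate = np.where(mask, scores, -np.inf)
    gate = np.exp(gate - gate.max(axis=1, keepdims=True))
    gate /= gate.sum(axis=1, keepdims=True)            # rows sum to 1 over selected heads

    # Weighted sum of per-head attention outputs.
    out = np.zeros((T, dh))
    for h in range(H):
        q, kk, v = x @ Wq[h], x @ Wk[h], x @ Wv[h]
        att = np.exp(q @ kk.T / np.sqrt(dh))
        att /= att.sum(axis=1, keepdims=True)
        out += gate[:, h:h + 1] * (att @ v)
    return out, gate

rng = np.random.default_rng(0)
T, d, H, dh = 5, 8, 4, 8
out, gate = moh_attention(
    rng.standard_normal((T, d)),
    rng.standard_normal((H, d, dh)),
    rng.standard_normal((H, d, dh)),
    rng.standard_normal((H, d, dh)),
    rng.standard_normal((d, H)),
    k=2,
)
print(out.shape)                       # (5, 8)
print((gate > 0).sum(axis=1))          # exactly 2 active heads per token
```

The heads a token skips cost nothing at the gating stage, which is the efficiency argument for routing over heads instead of always running all of them.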
21 Lessons, Get Started Building with Generative AI: https://microsoft.github.io/generative-ai-for-beginners/
60+ implementations/tutorials of deep learning papers with side-by-side notes; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), GANs (cyclegan, stylegan2, ...), reinforcement learning (ppo, dqn), capsnet, distillation, ...
Collection of awesome LLM apps with AI Agents and RAG using OpenAI, Anthropic, Gemini, and open-source models.
Finetune Llama 3.3, DeepSeek-R1 & reasoning LLMs 2x faster with 70% less memory!
Learn how to design, develop, deploy and iterate on production-grade ML applications.
Extract WeChat chat history and export it to HTML, Word, or Excel documents for permanent archiving; analyze the chat history to generate an annual chat report, and train a personal AI chat assistant on your own chat data.
AI's query engine - Platform for building AI that can learn and answer questions over large-scale federated data.
Self-hosted AI coding assistant
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch.
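The Vision Transformer's key preprocessing step is splitting an image into fixed-size patches and flattening each one into a token, which the transformer encoder then treats like a word sequence. A minimal NumPy sketch of that patchify step (the function name and shapes are my own illustration, not the repo's API):

```python
import numpy as np

def patchify(img, patch=4):
    """Split an (H, W, C) image into flattened non-overlapping patches.

    Returns an array of shape (num_patches, patch * patch * C),
    one row per patch, ready to be linearly projected into tokens.
    """
    H, W, C = img.shape
    rows, cols = H // patch, W // patch
    # Crop to a multiple of the patch size, then regroup axes so each
    # patch's pixels become one contiguous row.
    p = img[:rows * patch, :cols * patch]
    p = p.reshape(rows, patch, cols, patch, C)
    p = p.transpose(0, 2, 1, 3, 4)              # (rows, cols, patch, patch, C)
    return p.reshape(rows * cols, patch * patch * C)

rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8, 3))
tokens = patchify(img, patch=4)
print(tokens.shape)   # (4, 48): 2x2 grid of patches, 4*4*3 values each
```

In the full model each row would then be multiplied by a learned projection matrix, a class token prepended, and positional embeddings added before the encoder runs.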
An open-source RAG-based tool for chatting with your documents.