
llm-vram-calculator


Accurate VRAM calculator for local LLMs (Llama 4, DeepSeek V3, Qwen 2.5). Calculates GGUF quantization sizes, GQA context (KV-cache) overhead, and offloading limits.
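The description names the main components of such an estimate. As a rough illustration only (a hypothetical helper under simplified assumptions, not the calculator's actual code), total VRAM is approximately the quantized weight size plus the KV cache, where GQA means the cache scales with the number of key/value heads rather than all attention heads, plus a fixed runtime overhead:

```python
def estimate_vram_gib(params_b: float, bits_per_weight: float,
                      n_layers: int, n_kv_heads: int, head_dim: int,
                      context_len: int, kv_bits: int = 16,
                      overhead_gib: float = 1.0) -> float:
    """Rough VRAM estimate in GiB: quantized weights + GQA KV cache + overhead.

    Hypothetical sketch; real calculators also account for compute buffers,
    framework allocations, and per-quant block overheads.
    """
    # Quantized weights: parameter count (billions) times bits per weight.
    weights_gib = params_b * 1e9 * bits_per_weight / 8 / 1024**3
    # KV cache: 2 tensors (K and V) per layer; with GQA, only n_kv_heads
    # key/value heads are stored, not the full set of attention heads.
    kv_gib = (2 * n_layers * n_kv_heads * head_dim * context_len
              * kv_bits / 8) / 1024**3
    return weights_gib + kv_gib + overhead_gib

# Example: an 8B model at ~4.5 bits/weight (roughly Q4_K_M), 32 layers,
# 8 KV heads, head_dim 128, 8192-token context.
print(round(estimate_vram_gib(8, 4.5, 32, 8, 128, 8192), 2))
```

With these example numbers the weights come to about 4.19 GiB and the KV cache to exactly 1 GiB, so the estimate lands near 6.2 GiB; anything above a GPU's capacity is what offloading to system RAM would have to absorb.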

Created: 2025-11-28T00:52:50
Updated: 2025-12-02T07:54:09
https://gpuforllm.com/
Stars: 2 (+0 recently)