LLaVA-MoD
[ICLR 2025] LLaVA-MoD: Making LLaVA Tiny via MoE-Knowledge Distillation
Topics: hallucination, knowledge-distillation, llava, llm, mixture-of-experts, mllm, moe, multimodal-large-language-models, qwen, rlhf
Created: 2024-08-26T20:23:08
Updated: 2025-04-10T16:27:46
https://openreview.net/pdf?id=uWtLOy35WD
Stars: 186
Stars Increase: 1