Ant Group Unveils Two Innovative MoE Large Language Models with Significantly Reduced Training Costs

Ant Group's Ling team recently published a preprint on arXiv titled "Every FLOP Matters: Scaling a 300-billion parameter Mixture-of-Experts LING model without high-end GPUs," which introduces two new large language models, Ling-Lite and Ling-Plus. The models incorporate several innovations that enable efficient training on lower-end hardware, significantly reducing training costs.
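For readers unfamiliar with the Mixture-of-Experts (MoE) architecture mentioned above, the sketch below shows the core idea: a router sends each token to only a few "expert" sub-networks, so per-token compute stays small even as total parameters grow. This is a minimal illustrative example; the layer sizes, top-2 gating, and all names are assumptions and do not reflect Ant Group's Ling implementation.

```python
# Minimal sketch of a Mixture-of-Experts (MoE) layer with top-k routing.
# Illustrative only -- NOT Ant Group's Ling implementation; all sizes and
# the top-2 gating scheme are assumptions for demonstration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, n_experts)   # router: scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                            # x: (tokens, d_model)
        scores = F.softmax(self.gate(x), dim=-1)     # routing probabilities over experts
        weights, idx = scores.topk(self.top_k, dim=-1)   # keep only top-k experts per token
        weights = weights / weights.sum(dim=-1, keepdim=True)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = (idx == e)                        # tokens routed to expert e
            token_ids, slot = mask.nonzero(as_tuple=True)
            if token_ids.numel():
                out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(x[token_ids])
        return out

# Usage: only top_k experts run per token, which is what keeps active compute
# (and hence hardware requirements) low relative to total parameter count.
layer = MoELayer()
y = layer(torch.randn(16, 512))   # 16 tokens
print(y.shape)                    # torch.Size([16, 512])
```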

10.4k · 1 day ago