
AI News


Ant Group Unveils Two Innovative MoE Large Language Models with Significantly Reduced Training Costs

Ant Group's Ling team recently published a preprint on arXiv titled "Every FLOP Matters: Scaling a 300-billion parameter Mixture-of-Experts LING model without high-end GPUs," introducing two new large language models, Ling-Lite and Ling-Plus. Both incorporate a series of innovations that enable efficient training on lower-end hardware, significantly reducing training costs.
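The cost savings follow from basic MoE arithmetic: training compute scales with the parameters activated per token, not with the total parameter count. A minimal sketch using the common ~6·N·D FLOPs rule of thumb; the token budget is hypothetical, and the parameter figures are taken from the Ling-Lite model card below:

```python
# Rough training-FLOPs comparison: dense vs. MoE (illustrative numbers only).
# Uses the common ~6 * params * tokens approximation for training compute.

TOKENS = 1e12           # hypothetical training budget: 1T tokens
DENSE_PARAMS = 16.8e9   # a dense model of Ling-Lite's total size
MOE_ACTIVE = 2.75e9     # Ling-Lite's activated parameters per token

dense_flops = 6 * DENSE_PARAMS * TOKENS
moe_flops = 6 * MOE_ACTIVE * TOKENS

print(f"dense: {dense_flops:.2e} FLOPs")
print(f"MoE:   {moe_flops:.2e} FLOPs")
print(f"ratio: {dense_flops / moe_flops:.1f}x fewer FLOPs per token")
```

Under these assumptions the MoE model needs roughly 6x less compute per token than a dense model of the same total size, which is the lever that makes training without high-end GPUs plausible.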


Models


Ling Lite 1.5

inclusionAI


Ling is a large-scale Mixture-of-Experts (MoE) language model open-sourced by InclusionAI. The Lite version has 16.8 billion total parameters, of which only 2.75 billion are activated per token, while delivering strong performance for its activated size; a minimal routing sketch follows below.

Natural Language Processing · Transformers
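How an MoE keeps its activated parameter count so far below the total can be seen in a minimal top-k routing sketch. The dimensions, expert count, and k below are illustrative assumptions, not Ling's actual configuration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoELayer(nn.Module):
    """Minimal Mixture-of-Experts layer: each token is routed to its
    top-k experts, so only a fraction of parameters runs per token."""

    def __init__(self, d_model=256, d_ff=1024, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        scores = self.router(x)                     # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)  # pick top-k experts
        weights = F.softmax(weights, dim=-1)        # normalize over chosen
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e            # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

# With k=2 of 8 experts, only a quarter of expert parameters run per token.
layer = TopKMoELayer()
tokens = torch.randn(4, 256)
print(layer(tokens).shape)  # torch.Size([4, 256])
```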

InclusionAI_Ling Lite 0415 GGUF

bartowski


A quantized version of inclusionAI/Ling-lite, produced with llama.cpp's imatrix quantization and offering multiple quantization types to suit different hardware; a loading sketch follows below.

Natural Language Processing · GGUF
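As a usage sketch, a GGUF quantization like this can be loaded with the llama-cpp-python bindings. The local file name, context size, and prompt below are assumptions for illustration, not taken from the repository:

```python
# Minimal sketch: running a GGUF quantization with llama-cpp-python
# (pip install llama-cpp-python). The model path is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="./Ling-lite-0415-Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if available
)

out = llm("Explain Mixture-of-Experts in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```

Smaller quantization types (e.g., Q4 variants) trade some accuracy for lower memory use, which is why the repository offers several to match different hardware.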