
AI News


Ant Group and inclusionAI Jointly Launch Ming-Omni: The First Open Source Multi-modal GPT-4o


97.3k 7 hours ago

Models


Ming Flash Omni Preview

inclusionAI


Ming-flash-omni-preview is a multimodal large model built on the Ling-Flash-2.0 sparse mixture-of-experts (MoE) architecture, with 100B total parameters and only 6B activated per token. A comprehensive upgrade of Ming-Omni, it shows significant improvements in multimodal understanding and generation, especially in speech recognition, image generation, and segmentation editing.

Multimodal, Diffusers, English
inclusionAI · 1.4k · 31
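The "100B total / 6B active" figure above reflects how sparse MoE layers work: a small gating network scores all experts per token, but only the top few are actually run, so most parameters stay idle on any given forward pass. A minimal illustrative sketch of top-k expert routing (not Ming's actual implementation; all names and shapes here are hypothetical):

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Route each token to its top_k experts and mix their outputs.

    x:       (tokens, d) token activations
    gate_w:  (d, n_experts) router weights
    experts: list of (d, d) matrices, one tiny "expert" each

    Only top_k experts run per token; the rest are skipped entirely,
    which is why active parameters << total parameters in sparse MoE.
    """
    logits = x @ gate_w                           # (tokens, n_experts)
    top = np.argsort(-logits, axis=1)[:, :top_k]  # chosen expert indices
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        chosen = top[t]
        weights = np.exp(logits[t, chosen])
        weights /= weights.sum()                  # softmax over chosen experts only
        for w, e in zip(weights, chosen):
            out[t] += w * (x[t] @ experts[e])
    return out

rng = np.random.default_rng(0)
d, n_experts, tokens = 8, 4, 3
x = rng.normal(size=(tokens, d))
gate_w = rng.normal(size=(d, n_experts))
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]
y = moe_forward(x, gate_w, experts, top_k=2)
print(y.shape)  # (3, 8)
```

With 4 experts and top_k=2, each token touches only half the expert parameters per pass; production MoE models scale the same idea to dozens or hundreds of experts per layer.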
© 2025 AIBase