Yuanxiang Releases MoE Open Source Large Model XVERSE-MoE-A36B with 36 Billion Active Parameters

Shenzhen Yuanxiang Information Technology Co., Ltd. recently released XVERSE-MoE-A36B, China's largest open source Mixture of Experts (MoE) large model. The release marks a significant advance for China's AI field, bringing domestic open source models to an internationally competitive level.
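The "36 billion active parameters" in the name reflects the MoE design: a learned router sends each token to only a few expert sub-networks, so a forward pass uses just a fraction of the model's total parameters. Below is a minimal sketch of top-k expert routing in PyTorch; the layer sizes, expert count, and k are illustrative, not XVERSE's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Toy Mixture-of-Experts layer: route each token to k of n experts."""
    def __init__(self, d_model=64, d_ff=256, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)  # learned gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                       # x: (tokens, d_model)
        logits = self.router(x)                 # (tokens, n_experts)
        weights, idx = logits.topk(self.k, dim=-1)
        weights = F.softmax(weights, dim=-1)    # renormalize over chosen experts
        out = torch.zeros_like(x)
        # Only the k selected experts run for each token; the rest stay idle,
        # which is why "active" parameters are far fewer than total parameters.
        for slot in range(self.k):
            for e in range(len(self.experts)):
                mask = idx[:, slot] == e
                if mask.any():
                    w = weights[mask, slot].unsqueeze(-1)
                    out[mask] += w * self.experts[e](x[mask])
        return out

moe = TopKMoE()
print(moe(torch.randn(4, 64)).shape)  # torch.Size([4, 64])
```

Scaled up, this pattern lets total capacity grow with the number of experts while per-token compute stays tied to k, which is how an MoE model can have a total parameter count far above the 36B that are active per token.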


AI Products

XVERSE-MoE-A36B

A large multilingual language model that supports text generation across a wide range of domains.
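As a usage sketch (not part of the announcement): XVERSE has published its earlier models on the Hugging Face Hub, so assuming this release follows the same pattern, with a repo id like xverse/XVERSE-MoE-A36B and custom modeling code requiring trust_remote_code (both assumptions), loading it with the transformers library would look roughly like this.

```python
# Hedged sketch: loading the model with Hugging Face transformers.
# The repo id "xverse/XVERSE-MoE-A36B" is assumed from XVERSE's earlier
# naming convention; trust_remote_code=True is assumed because earlier
# XVERSE checkpoints ship custom modeling code.
# Requires: pip install transformers accelerate
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xverse/XVERSE-MoE-A36B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # bf16 halves memory vs. fp32
    device_map="auto",           # shard layers across available GPUs
    trust_remote_code=True,
)

prompt = "Briefly explain what a Mixture of Experts model is:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```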
