AI News

Xiaohongshu makes a major move! The all-new open-source large model dots.llm1 makes a stunning debut with 142 billion parameters!

Xiaohongshu Releases First Open-Source Large Model dots.llm1: 11.2 Trillion Synthetic Data Tokens Boost Chinese Performance

Models

Dots.llm1.inst

rednote-hilab

dots.llm1 is a large-scale MoE (Mixture-of-Experts) model that activates 14 billion of its 142 billion total parameters, with performance comparable to state-of-the-art models.

Tags: Natural Language Processing, Transformers, Multiple Languages
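The card above lists the model in the Transformers format. Below is a minimal sketch of how such a checkpoint is typically loaded with Hugging Face transformers; the repository ID rednote-hilab/dots.llm1.inst, the trust_remote_code flag, and the chat-template call are assumptions inferred from the card, not instructions from the publisher.

# Minimal sketch: loading a Transformers-format MoE checkpoint such as dots.llm1.inst.
# Repository ID and generation settings are assumptions based on the model card above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rednote-hilab/dots.llm1.inst"  # assumed Hugging Face repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # only ~14B of the 142B parameters are active per token,
    device_map="auto",            # but all expert weights must still fit in memory
    trust_remote_code=True,
)

messages = [{"role": "user", "content": "Introduce Xiaohongshu in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))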