Qwen1.5-MoE-A2.7B
A large-scale Mixture-of-Experts (MoE) language model that activates only about 2.7 billion parameters per token, yet rivals the performance of 7-billion-parameter models.
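The "A2.7B" in the name refers to the activated parameter count: an MoE layer holds many expert networks but routes each token through only a few of them, so far fewer parameters run per token than the model contains in total. The routing idea can be sketched as follows (a toy numpy illustration of top-k expert routing, not Qwen's actual implementation; all names and shapes are made up for the example):

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Toy MoE layer: route each token to its top-k experts.

    x:       (tokens, d)  input activations
    gate_w:  (d, n_experts)  router weights
    experts: list of (d, d) expert weight matrices
    """
    logits = x @ gate_w                           # router score per expert
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # top-k expert ids per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, top[t]]
        w = np.exp(sel - sel.max())
        w /= w.sum()                              # softmax over selected experts
        for weight, e in zip(w, top[t]):
            # only top_k experts execute for this token; the rest stay idle,
            # which is why activated parameters << total parameters
            out[t] += weight * (x[t] @ experts[e])
    return out
```

With, say, 60 experts and top_k=4, only a small fraction of the expert parameters participate in any single forward pass, which is what lets an MoE model match a much larger dense model at a fraction of the per-token compute.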
Qwen1.5-MoE-A2.7B Visits Over Time
Monthly Visits:   897,447
Bounce Rate:      61.70%
Pages per Visit:  1.4
Visit Duration:   00:00:34