Moonlight-16B-A3B
Moonlight-16B-A3B is a 16B-parameter Mixture-of-Experts (MoE) model with roughly 3B activated parameters per token, trained with the Muon optimizer for efficient language generation.
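A minimal sketch of loading the model with the Hugging Face Transformers library is shown below; the Hub repo id "moonshotai/Moonlight-16B-A3B-Instruct" is assumed, and the custom model code requires trust_remote_code.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face Hub repo id for the instruction-tuned variant.
model_id = "moonshotai/Moonlight-16B-A3B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",        # pick the checkpoint's native precision
    trust_remote_code=True,    # MoE architecture ships custom modeling code
)

prompt = "Explain Mixture-of-Experts models in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))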
Moonlight-16B-A3B Visit Over Time
Monthly Visits: 25,296,546
Bounce Rate: 43.31%
Pages per Visit: 5.8
Visit Duration: 00:04:45
Moonlight-16B-A3B Visit Trend
Moonlight-16B-A3B Visit Geography
No geography data available.