INTELLECT-3 is a Mixture-of-Experts (MoE) model with 106 billion parameters, trained through large-scale reinforcement learning. It achieves strong performance on mathematics, coding, and reasoning benchmarks. The model, training framework, and training environments are all open-sourced under a permissive license.