JetMoE-8B is an open-source large language model that surpasses Meta AI's LLaMA2-7B in performance while costing less than $100,000 to train, relying only on public datasets and optimized training methods. Because of its sparsely activated Mixture-of-Experts architecture, the model activates only 2.2 billion of its parameters during inference, significantly reducing computational cost while maintaining strong performance.
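
To make the sparse-activation idea concrete, the sketch below shows a generic top-k Mixture-of-Experts feed-forward layer in PyTorch. The expert count, top-k value, and layer dimensions are illustrative assumptions, not JetMoE-8B's actual configuration; the point is that only the routed experts' parameters run for each token.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Illustrative top-k sparse Mixture-of-Experts feed-forward layer.

    The hyperparameters below (8 experts, top-2 routing, layer sizes) are
    assumptions for illustration, not JetMoE-8B's exact configuration.
    """
    def __init__(self, d_model=1024, d_hidden=2816, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (num_tokens, d_model)
        # The router scores every expert for each token, but only the
        # top-k experts are actually evaluated per token.
        gate_logits = self.router(x)
        weights, expert_idx = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)

        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            token_rows, slots = (expert_idx == i).nonzero(as_tuple=True)
            if token_rows.numel() == 0:
                continue  # this expert's parameters stay inactive for this batch
            out[token_rows] += (
                weights[token_rows, slots].unsqueeze(-1) * expert(x[token_rows])
            )
        return out

# Only top_k / num_experts of the expert parameters run per token, which is
# how an 8B-parameter model can activate only about 2.2B at inference time.
moe = SparseMoELayer()
tokens = torch.randn(4, 1024)
print(moe(tokens).shape)  # torch.Size([4, 1024])
```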