Yi-1.5 is an upgraded version of Yi. It is continuously pre-trained on top of Yi with a high-quality corpus of 500 billion tokens and fine-tuned on 3 million diverse fine-tuning samples. Compared with Yi, it delivers stronger performance in coding, math, reasoning, and instruction following, while maintaining excellent capabilities in language understanding, commonsense reasoning, and reading comprehension.
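Since the model is served through the Transformers library, a minimal inference sketch is shown below. The model ID `01-ai/Yi-1.5-9B-Chat` and the generation settings are illustrative assumptions, not values taken from this card; substitute the Yi-1.5 variant and parameters you actually intend to use.

```python
# Minimal sketch: loading a Yi-1.5 chat variant with Hugging Face Transformers.
# The model ID and generation settings below are assumptions for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "01-ai/Yi-1.5-9B-Chat"  # assumed chat variant
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick a dtype appropriate for the hardware
    device_map="auto",    # place weights across available devices
)

# Chat variants ship a chat template; apply it to format the prompt.
messages = [{"role": "user", "content": "Summarize what changed from Yi to Yi-1.5."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Illustrative generation settings, not tuned values.
output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```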