InternLM-20B is a 20-billion-parameter pretrained model jointly released by Shanghai AI Laboratory, SenseTime, The Chinese University of Hong Kong, and Fudan University. Pretrained on over 2.3T tokens of high-quality English, Chinese, and code data, it delivers strong overall performance and robust tool-invocation capabilities.
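A minimal loading and generation sketch with Hugging Face Transformers. The Hub ID `internlm/internlm-20b` and the use of `trust_remote_code=True` are assumptions based on how InternLM models are typically published; adjust them to the actual release.

```python
# Minimal sketch: load InternLM-20B with Hugging Face Transformers.
# The Hub ID and trust_remote_code flag are assumed, not confirmed by this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "internlm/internlm-20b"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 20B parameters need roughly 40 GB in fp16
    device_map="auto",          # shard across available GPUs
    trust_remote_code=True,
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is the base pretrained model rather than a chat-tuned variant, plain text completion as above is the expected usage pattern.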