SmolLM2-1.7B
A lightweight language model with 1.7 billion parameters, suitable for diverse tasks.
Tags: Common Product, Programming, Text Generation, Lightweight Model
SmolLM2 is a series of lightweight language models available in 135M, 360M, and 1.7B parameter versions. These models handle a wide range of tasks while maintaining a small footprint, making them well suited to on-device deployment. The 1.7B version shows significant improvements over its predecessor, SmolLM1-1.7B, in instruction following, knowledge, reasoning, and mathematics. It was trained on multiple datasets, including FineWeb-Edu, DCLM, and The Stack, and was further aligned with Direct Preference Optimization (DPO) using UltraFeedback. The model also supports tasks such as text rewriting, summarization, and function calling.
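For context on the DPO step mentioned above, here is a minimal sketch of the standard DPO loss for a single preference pair. This is a generic illustration of the technique, not SmolLM2's actual training code; the inputs are assumed to be summed log-probabilities of the chosen and rejected responses under the policy and a frozen reference model.

```python
import math

def dpo_loss(pi_chosen, pi_rejected, ref_chosen, ref_rejected, beta=0.1):
    """DPO loss for one preference pair (illustrative sketch).

    Arguments are log-probabilities of the chosen/rejected responses
    under the policy being trained (pi_*) and under the frozen
    reference model (ref_*). beta scales the implicit reward.
    """
    # Implicit reward margin: beta * (policy log-ratio minus reference log-ratio).
    margin = beta * ((pi_chosen - ref_chosen) - (pi_rejected - ref_rejected))
    # Negative log-sigmoid of the margin: shrinks as the policy
    # prefers the chosen response more strongly than the reference does.
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

When the policy exactly matches the reference, the margin is zero and the loss is log 2; pushing probability toward the chosen response lowers it.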
SmolLM2-1.7B Visits Over Time
Monthly Visits: 25,537,072
Bounce Rate: 44.24%
Pages per Visit: 5.9
Visit Duration: 00:04:47