Stable LM 2 1.6B is a compact multilingual language model with 1.6 billion parameters, trained on data in English, Spanish, German, Italian, French, Portuguese, and Dutch. Its small size and fast performance lower the hardware barrier, enabling more developers to participate in the generative AI ecosystem. We release not only the pre-trained and fine-tuned versions but also, for the first time, the last checkpoint before the pre-training cooldown, including optimizer states, to help developers fine-tune and experiment smoothly.
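
To give a sense of how the released checkpoints can be picked up, below is a minimal sketch of loading the base model with Hugging Face transformers; the repository id stabilityai/stablelm-2-1_6b, the dtype, and the generation settings are illustrative assumptions rather than prescribed usage.

```python
# Minimal sketch: load the base checkpoint and generate a short completion.
# The repo id below is an assumption; substitute the actual release location.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-2-1_6b"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 1.6B parameters fit comfortably on a single consumer GPU
)

inputs = tokenizer("The weather today is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```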