Chronos-33b is a 33-billion-parameter large language model focused on chat, role-playing, and story writing. Built on the Transformer architecture, it generates coherent long-form text and also handles simple reasoning and coding tasks. The model uses the Alpaca prompt format and performs well across a range of natural language processing tasks.
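Since the model expects Alpaca-formatted input, a minimal sketch of building such a prompt may help; the template wording and the example instruction are illustrative assumptions, not taken from the model card.

```python
# Sketch of the Alpaca prompt format (wording is an assumption, verify
# against the model card before use).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Alpaca-style template."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

# Hypothetical usage: the resulting string would be passed to the model
# (e.g. via a text-generation pipeline) as the full prompt.
prompt = build_prompt("Write a short story about a clockmaker.")
print(prompt)
```

The `### Instruction:` / `### Response:` markers are what distinguish the Alpaca format; the model is expected to continue the text after `### Response:`.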
Natural Language Processing
Transformers