NorMistral-7b-scratch is a large language model for Norwegian, developed jointly by the Language Technology Group at the University of Oslo, the HPLT project, the National Library of Norway, and the University of Turku. The model is based on the Mistral architecture, has 7 billion parameters, and was pre-trained from scratch on a Norwegian corpus of 260 billion subword tokens. It is part of the NORA.LLM series.