Falcon-H1 is a new family of large language models developed by the Technology Innovation Institute (TII) of the United Arab Emirates. It adopts a hybrid architecture that combines the Transformer attention mechanism with state space models (SSMs), offering strong long-context capability and computational efficiency. The series is available in configurations ranging from 0.5B to 34B parameters and performs well on reasoning, mathematics, and multilingual tasks.
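As a quick illustration, below is a minimal sketch of loading a Falcon-H1 checkpoint with the Hugging Face Transformers library and generating text. The model ID `tiiuae/Falcon-H1-0.5B-Instruct` is an assumption for illustration; substitute whichever checkpoint size you intend to use from the TII organization.

```python
# Minimal sketch: load a Falcon-H1 checkpoint and generate text.
# The model ID below is an assumed example; replace it with the
# actual checkpoint you want (0.5B up to 34B variants).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/Falcon-H1-0.5B-Instruct"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode a prompt and generate a short continuation.
inputs = tokenizer("Falcon-H1 combines attention and SSM layers to", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```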
Tags: Natural Language Processing, Transformers, Multiple Languages