GLM-4-32B-0414 is a new member of the GLM family: a high-performance large language model with 32 billion parameters. It was pre-trained on 15T tokens of high-quality data, including a large amount of synthetic reasoning data, and performs strongly across task scenarios such as code generation, function calling, and search-based Q&A. Its performance rivals that of larger models such as GPT-4o and DeepSeek-V3.