Granite-4.0-H-Small is a 32-billion-parameter, long-context instruction model from IBM, fine-tuned from Granite-4.0-H-Small-Base. It was trained on a mix of open-source instruction datasets and internally generated synthetic data, using techniques including supervised fine-tuning, reinforcement-learning-based alignment, and model merging. The model offers substantially improved instruction following and tool-calling capabilities, making it well suited to enterprise applications.
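As a minimal sketch of how the tool-calling capability mentioned above is typically exercised, the snippet below assembles chat messages together with an OpenAI-style tool schema, the format commonly passed to a Transformers tokenizer's `apply_chat_template`. The tool name and exact schema keys here are illustrative assumptions, not taken from this model card; consult the model's documentation for the supported format.

```python
# Sketch: assembling a tool-calling request for an instruction-tuned chat model.
# The tool definition and message layout below follow the common OpenAI-style
# convention; the specific tool ("get_weather") is hypothetical.

def build_request(user_query: str) -> dict:
    # Tool definition in the JSON-schema style widely accepted by chat templates.
    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool for illustration
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }]
    messages = [
        {"role": "system", "content": "You are a helpful enterprise assistant."},
        {"role": "user", "content": user_query},
    ]
    # In practice these would be rendered into a prompt with something like:
    #   tokenizer.apply_chat_template(messages, tools=tools,
    #                                 add_generation_prompt=True)
    return {"messages": messages, "tools": tools}

request = build_request("What's the weather in Zurich?")
```

The model is then expected to either answer directly or emit a structured call to one of the declared tools, which the calling application executes and feeds back as a tool-role message.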