Tencent HY-MT today announced the open-source release of version 1.5 of its translation model. The update includes two model sizes, Tencent-HY-MT1.5-1.8B and Tencent-HY-MT1.5-7B, and aims to redefine edge-cloud collaborative translation with extreme efficiency and leading translation quality.

Key Highlights: Edge Deployment and Outstanding Performance
The 1.8B model is the standout of this release. A lightweight model designed for consumer devices such as smartphones, it runs smoothly offline in only 1GB of memory after quantization.
Extreme Speed: It processes 50 tokens in an average of 0.18 seconds, well ahead of the roughly 0.4 seconds typical of mainstream commercial translation APIs.
Superior Performance: On authoritative test sets such as FLORES-200, it reaches the 90th-percentile level of large closed-source models like Gemini-3.0-Pro and comprehensively surpasses medium-sized open-source models.
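The "1GB after quantization" figure is consistent with simple arithmetic: 1.8 billion weights at 4-bit precision occupy about 0.9GB. A quick sketch of that estimate (the 4-bit precision is an assumption for illustration; the announcement does not specify the quantization scheme):

```python
def quantized_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """Rough weight-memory footprint of a quantized model in gigabytes.

    Counts only the weights themselves; activations and KV cache
    add some overhead on top of this at inference time.
    """
    return n_params * bits_per_weight / 8 / 1e9

# 1.8B parameters at an assumed 4-bit quantization:
mem = quantized_memory_gb(1.8e9, 4)  # 0.9 GB, under the stated 1GB budget
```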

Comprehensive Coverage: From Mainstream Languages to Dialects and Chinese
The HY-MT 1.5 models support mutual translation among 33 languages, including Chinese, English, Japanese, and French, with notably strengthened support for lower-resource languages such as Czech, Estonian, and Icelandic. The models also cover five Chinese minority languages and dialects, greatly expanding the application boundaries of AI translation.
Functional Evolution: More Practical Translation Experience
For practical application scenarios, version 1.5 delivers significant upgrades along three dimensions:
Custom Terminology Library: Users can upload terminology lists for specialized fields such as medicine, law, and finance to ensure consistent translation of professional terms.
Context Understanding: Advanced long-text and dialogue comprehension lets the model refine later output based on the preceding context, avoiding semantic breaks.
Format Preservation: Through precise instruction following, the model preserves the original formatting (such as web pages, code, and Markdown) after translation.
Technical Breakthrough: Large Model Guiding Small Model
HY-MT1.5-1.8B achieves so much with so little thanks to Tencent's on-policy distillation strategy: the 7B "teacher" model guides the "student" model in real time, so the student learns from its own prediction deviations rather than simply memorizing the teacher's answers, significantly improving the small model's reasoning and translation abilities.
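The key property of on-policy distillation is that the training signal is computed on sequences the student itself generates, with the teacher scoring each position of that trajectory. A toy sketch of the idea (a per-token reverse-KL objective is one common formulation; the announcement does not detail Tencent's exact loss):

```python
import math

def reverse_kl(student, teacher):
    """KL(student || teacher) for one token's probability distribution."""
    return sum(s * math.log(s / t) for s, t in zip(student, teacher) if s > 0)

def on_policy_distill_loss(sampled_tokens, student_dists, teacher_dists):
    """Average reverse KL over a sequence the *student itself* sampled.

    On-policy: the student generates the trajectory, then the teacher
    scores the student's own states token by token, so the loss targets
    exactly where the student deviates -- rather than forcing it to
    imitate fixed teacher outputs it might never reach on its own.
    student_dists[t] / teacher_dists[t] are the two models' next-token
    distributions at position t of the student's sample.
    """
    total = sum(reverse_kl(s, t) for s, t in zip(student_dists, teacher_dists))
    return total / len(sampled_tokens)

# Toy example: a 3-token vocabulary, a 2-token student sample.
sampled = ["Bonjour", "monde"]
student = [[0.7, 0.2, 0.1], [0.5, 0.3, 0.2]]
teacher = [[0.6, 0.3, 0.1], [0.5, 0.3, 0.2]]
loss = on_policy_distill_loss(sampled, student, teacher)  # > 0: deviation at step 1
```

The loss is zero exactly when the student already matches the teacher at every position it visits, which is why this setup teaches correction of deviations instead of rote memorization.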
Developer Ecosystem: Full Platform Support
The HY-MT 1.5 models are now available on major open-source model platforms.
From Tencent Meeting to Enterprise WeChat, Tencent HY translation technology already powers multiple internal high-concurrency scenarios. By open-sourcing version 1.5, Tencent is further pushing high-quality AI translation toward broad accessibility, giving global developers a more cost-effective translation solution.
