At the 2025 Baidu Cloud Intelligence Conference held today, Shen Dou, Executive Vice President of Baidu Group and President of Baidu Intelligent Cloud Business Group, announced that Baidu Intelligent Cloud's Baige AI Computing Platform has been officially upgraded to version 5.0. The upgrade targets the efficiency bottlenecks in today's AI computing by comprehensively enhancing capabilities in four areas: networking, computing power, inference systems, and integrated training-inference systems.
In terms of networking, Baige 5.0 delivers faster communication and lower latency, significantly improving the efficiency of model training and inference.
In terms of computing power, following the release of the Kunlun Chip Super Node at the Create 2025 Baidu AI Developer Conference in April this year, the upgraded Baige 5.0 platform has been officially launched on Baidu Intelligent Cloud's public cloud, providing users with ultra-high computing power. Shen Dou noted that the largest open-source models in the industry have reached the trillion-parameter scale, yet with the help of the Kunlun Chip Super Node, users can get such a model running on a single cloud instance in just a few minutes.
In terms of the inference system, the new version greatly improves throughput and reduces latency through three core strategies: decoupling, adaptivity, and intelligent scheduling. In the area of training-inference integration, Baidu also released the Baige Reinforcement Learning Framework, which makes full use of computing resources and thereby significantly improves training and inference efficiency.