Baidu announced that its search AI assistant has fully rolled out a new ultra-fast model. The introduction of this self-developed technology marks a qualitative leap in the speed of generating search results. According to the latest test data, the new ultra-fast model generates search results five times faster than the previously used DeepSeek V3.1 while maintaining the same level of quality, and cuts the call cost to just 70% of the previous level.
The core of the new technology lies in comprehensive optimization of the large-model inference process. This optimization covers generation speed, call cost, and output quality, so that users can get accurate information in the shortest possible time. Specifically, the Baidu AI assistant's time to first token has been shortened by 39%, and its response speed has increased by 500%. Baidu says these figures put it at the forefront of the industry in search efficiency.