OpenAI recently announced that it has no plans to adopt Google's TPU chips at scale, despite early testing. The TPU (Tensor Processing Unit) is a custom ASIC designed by Google to accelerate the training and inference of neural networks. At its core is a systolic array that pipelines matrix multiplications through a grid of multiply-accumulate units, reducing the number of memory accesses per operation.
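
To make the architecture description concrete, here is a minimal JAX sketch of the kind of dense matrix multiplication a TPU's systolic array is built to accelerate. The shapes, dtype, and function name are illustrative assumptions, not details from the announcement; the snippet runs on whichever backend JAX detects (TPU, GPU, or CPU).

```python
# Minimal sketch: a dense matmul that XLA maps onto the TPU's matrix unit
# when a TPU backend is available. Sizes and dtype are illustrative.
import jax
import jax.numpy as jnp

@jax.jit  # XLA compiles the function for the active backend
def matmul(a, b):
    return jnp.dot(a, b)

key = jax.random.PRNGKey(0)
ka, kb = jax.random.split(key)
a = jax.random.normal(ka, (1024, 1024), dtype=jnp.bfloat16)
b = jax.random.normal(kb, (1024, 1024), dtype=jnp.bfloat16)

print(jax.devices())         # lists the devices JAX found (TPU/GPU/CPU)
print(matmul(a, b).shape)    # (1024, 1024)
```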

OpenAI said it will continue to rely on NVIDIA GPUs and AMD AI accelerators to support the training and inference of its models, because both vendors' products are proven at scale and OpenAI already has supply agreements in place with them. Although some media reports said OpenAI had begun using Google's AI chips for certain tasks, insiders noted that these are lower-performance TPUs, and that Google's most advanced chips, built for its Gemini large language models, remain reserved for internal use.

OpenAI's collaboration with Google Cloud addresses its broader infrastructure needs, but the company made clear that it will not shift a large share of its compute to the TPU platform in the short term. Investors and analysts had previously read a potential TPU partnership as a signal that OpenAI was seeking alternatives to NVIDIA; Morgan Stanley strategists even argued that such a move would validate the competitiveness of Google's hardware in the market.

However, OpenAI's statement reflects both its close ties to existing chip partners and the complexity of deploying new hardware at scale. With demand for AI compute continuing to grow, OpenAI appears to prefer expanding gradually from its current GPU fleet and limited TPU trials rather than transitioning fully to the TPU architecture. This chip roadmap signals to the market that NVIDIA and AMD will remain OpenAI's core suppliers, which may limit Google's growth in the AI hardware market despite continued advances in TPU technology.

Going forward, investors will be watching OpenAI's next infrastructure update and Google Cloud's earnings reports for signs of shifting TPU usage or further supplier diversification.

Key points:  

🌟 OpenAI has no near-term plans for large-scale use of Google TPUs and continues to rely on NVIDIA and AMD.  

📈 Google's TPUs have undergone preliminary testing at OpenAI, but there are no plans for large-scale deployment.  

🔍 Investors will monitor OpenAI's infrastructure updates and Google Cloud's financial reports to understand TPU usage.