In a romanticized article about the "singularity," OpenAI CEO Sam Altman inadvertently revealed an interesting detail about the energy consumption of artificial intelligence: a single ChatGPT query consumes on average 0.34 watt-hours of electricity and 0.000085 gallons of water. At first glance, the figure seems negligible, roughly on par with the energy cost of a Google search in 2009. But while Altman emphasized this low per-query consumption, he glossed over a key factor: the number of ChatGPT requests processed, per person and in total, may far exceed the Google search volumes of that era.
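A rough back-of-the-envelope calculation shows why the per-query figure alone says little about aggregate impact. In the sketch below, only the per-query electricity and water numbers come from Altman's post; the daily query volume is a hypothetical assumption chosen purely for illustration, not a reported figure.

```python
# Scale Altman's per-query figures to an assumed daily query volume.
# Per-query numbers are the ones Altman cited; QUERIES_PER_DAY is
# an illustrative assumption, not an official statistic.

WH_PER_QUERY = 0.34              # watt-hours of electricity per query (Altman)
GALLONS_PER_QUERY = 0.000085     # gallons of water per query (Altman)
QUERIES_PER_DAY = 1_000_000_000  # assumed daily volume (illustrative only)

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6   # Wh -> MWh
daily_gallons = GALLONS_PER_QUERY * QUERIES_PER_DAY

print(f"Electricity: {daily_mwh:,.0f} MWh/day "
      f"(~{daily_mwh * 365 / 1e3:,.0f} GWh/year)")
print(f"Water: {daily_gallons:,.0f} gallons/day")
```

At a billion queries a day, the "negligible" per-query cost already scales to hundreds of megawatt-hours of electricity and tens of thousands of gallons of water daily, which is precisely what the comparison to a 2009-era Google search obscures.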
Altman's "romanticized" description might aim to highlight efficiency improvements brought by technological progress. However, with the explosive growth in computational demand for new AI models such as multimodal systems, intelligent agents, and advanced reasoning engines, the rapid expansion of data centers is an undeniable fact. The construction and operation of these massive infrastructures clearly indicate that the overall energy demands of AI systems will continue to rise.
We cannot simply compare the energy cost of a single ChatGPT query with that of a past Google search while ignoring the enormous volume and complexity of the tasks current AI applications carry. Whether it is text generation, image recognition, or more complex decision support, every step depends on compute-intensive processing. This insatiable appetite for computing power not only drives the continued expansion of data centers but also poses a significant energy challenge.
The consumption figures Altman cited are therefore less a comfort than a reminder. They force a harder question: how to manage and rein in AI's growing energy footprint while pursuing its potential will be an unavoidable issue for future technological development, one that bears not only on engineering but on sustainable development and global energy strategy.