In the field of AI, energy is often seen as the ultimate bottleneck of computing power. To support its ambitious infrastructure, OpenAI is placing an epic energy bet with the fusion startup Helion Energy. According to recent disclosures, the two parties are negotiating a decade-long power procurement agreement: OpenAI plans to obtain 5 gigawatts (5,000 megawatts) of electricity by 2030 and eventually expand to an astonishing 50 gigawatts by 2035.

Fusion: The "Holy Grail" of AI's Long Journey

Training large models and maintaining inference services are truly "power-hungry" operations. Running ChatGPT alone reportedly consumes as much electricity per day as 33,000 U.S. households. As model sizes move toward trillions of parameters, traditional power grids are struggling to keep up with this exponential growth in demand.

Fusion technology, which mimics the energy-generating process of the sun, is seen as the ultimate path to virtually unlimited clean energy. Helion claims to be approaching "scientific breakeven," the point at which the energy produced exceeds the energy consumed. Although no private company has yet achieved this goal, OpenAI's involvement undoubtedly provides a powerful catalyst for this "hard tech" field.

Big Players Enter the Power Struggle

Behind this energy race lies the tech giants' preemptive positioning for future computing dominance. In addition to OpenAI, Microsoft and Google are also actively investing in fusion, aiming to seize the initiative in the global energy transition.

Notably, OpenAI CEO Sam Altman was one of Helion's lead investors. To address public concerns about potential conflicts of interest, Altman has resigned as chairman of Helion's board and recused himself from the negotiations. This vertical integration, extending from model development down to the underlying energy supply, reveals a harsh truth about the AI industry: future competition will be decided not only by algorithms but also by ultimate control over physical-world resources, namely electricity and chips.