NVIDIA-backed startup Starcloud has trained a large language model (LLM) in space for the first time, a significant step toward space-based data centers. As demand for computing power and energy grows, tapping resources in space is emerging as a future direction for the industry.
Starcloud launched its Starcloud-1 satellite last month. Equipped with an NVIDIA H100 GPU, the satellite completed a training run of Andrej Karpathy's nanoGPT model and performed inference with Google DeepMind's Gemma model. Philip Johnston, founder and CEO of Starcloud, wrote on LinkedIn: "We have just successfully trained the first LLM in space using NVIDIA H100! We are also the first team to run a version of Google's Gemma in space!"
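For a sense of what such a workload involves, below is a minimal, hypothetical sketch of the training half: a tiny nanoGPT-style, decoder-only character-level model trained with PyTorch on whatever single GPU is available (an H100 in Starcloud's case). The toy corpus, model dimensions, and hyperparameters are placeholders for illustration, not details Starcloud or Karpathy have published; the Gemma inference side would similarly rely on standard open-weight model tooling.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

device = "cuda" if torch.cuda.is_available() else "cpu"

# Toy character-level corpus; a real run would use a much larger dataset.
text = "hello space, hello gpu " * 200
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text], device=device)

block_size, n_embd, n_head, n_layer, batch_size = 64, 128, 4, 2, 32

class TinyGPT(nn.Module):
    """Decoder-only transformer in the spirit of nanoGPT, shrunk for illustration."""
    def __init__(self, vocab_size):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, n_embd)
        self.pos = nn.Embedding(block_size, n_embd)
        layer = nn.TransformerEncoderLayer(
            n_embd, n_head, dim_feedforward=4 * n_embd,
            batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layer)
        self.head = nn.Linear(n_embd, vocab_size)

    def forward(self, idx):
        T = idx.size(1)
        x = self.tok(idx) + self.pos(torch.arange(T, device=idx.device))
        # Causal mask so each position only attends to earlier positions (GPT-style).
        mask = torch.triu(torch.full((T, T), float("-inf"), device=idx.device), diagonal=1)
        return self.head(self.blocks(x, mask=mask))

model = TinyGPT(len(chars)).to(device)
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)

for step in range(200):  # short demonstration run
    ix = torch.randint(len(data) - block_size - 1, (batch_size,), device=device)
    xb = torch.stack([data[i:i + block_size] for i in ix])          # inputs
    yb = torch.stack([data[i + 1:i + 1 + block_size] for i in ix])  # next-char targets
    logits = model(xb)
    loss = F.cross_entropy(logits.view(-1, logits.size(-1)), yb.view(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 50 == 0:
        print(f"step {step}: loss {loss.item():.3f}")
```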
Johnston added that the achievement marks a major step toward moving computing resources into space, with the aim of easing the strain on Earth's energy resources and tapping near-unlimited solar power. Adi Oltean, Starcloud's Chief Technology Officer, said that operating the H100 in orbit demanded considerable innovation and effort from the company's engineering team, which plans to test more models in the future.
Starcloud was founded in 2024 and advocates building computing centers in space to relieve the environmental pressures facing traditional data centers. The International Energy Agency projects that data center electricity consumption will double by 2030. Terrestrial facilities also contend with water shortages and rising emissions, whereas space platforms can draw on uninterrupted solar energy and sidestep water-intensive cooling.
The company plans to build a fully solar-powered orbital data center with 5 gigawatts of capacity, with solar arrays spanning roughly four kilometers across. That output would exceed the largest power plants on Earth, while the facility is intended to be more cost-effective and compact than ground-based solar farms of comparable scale.
Beyond Starcloud, companies such as Google, SpaceX, and Jeff Bezos's Blue Origin are also exploring space-based data centers. Google recently announced Project Suncatcher, which aims to place AI data centers in orbit, linking satellites with high-throughput optical connections to form distributed computing clusters. Google CEO Sundar Pichai has described it as a "moonshot," with early testing planned for 2027.
SpaceX, for its part, plans to use its next generation of Starlink satellites to build out space-based data centers, and expects to offer the lowest-cost AI computing option within the next five years.
Key Points:
🌌 Starcloud has trained a large language model in space for the first time, marking a major step towards space-based data centers.
☀️ The company plans to build a 5-gigawatt solar-powered data center, which is expected to surpass the largest ground-based power plants.
🚀 Many companies are exploring the possibility of space-based data centers, driving rapid development in space computing.