The AI hype takes off: Tech giants race to build AI factories in space

Artificial intelligence consumes enormous amounts of electricity. A single Microsoft AI data center, scheduled to go online in 2026, is expected to use half as much electricity as the city of Zurich. And these "AI factories" are currently being built at a dizzying pace. The AI hype thrives on ever-growing computing power: 800 million people use ChatGPT every week, millions more play around with AI video apps, and lately even web browsers are being built on AI. The servers of the AI companies are running hotter and hotter.
OpenAI alone plans to invest more than a trillion dollars in new data centers over the next few years. Meta, Amazon, Google, and Elon Musk's xAI have also succumbed to the data-center craze. Some experts wonder where all the electricity for these sprawling server farms is supposed to come from.
Greedy tech companies: the "hyperscalers"
The tech giants are planning an unprecedented expansion of their IT infrastructure to handle the overwhelming mountain of data generated by their AI services.
Jeff Bezos believes that in ten to twenty years, AI factories will be built not on Earth, but in space. Sam Altman, head of OpenAI, also likes the idea. Former Google CEO and investor Eric Schmidt bought a rocket startup to launch AI supercomputers into space. And Elon Musk, with SpaceX, is also aiming to build orbital data centers.
However, some experts doubt that Earth will run short of electricity for AI data centers; others are pinning their hopes on nuclear power. Still others fundamentally question the usefulness of AI, and with it its commercial success. AI optimists, on the other hand, believe the technology's boom is unstoppable. In the long run, they argue, there will be no way around tapping the cheap solar energy available in space.
Among those predicting an explosion in electricity demand due to AI is Philip Johnston, CEO of the startup Starcloud. "In the next three years alone, AI will require the power output of ten new nuclear power plants," he says. The rapid build-out of data centers is already driving up electricity prices noticeably in the US.
The AI hype is reaching astronomical dimensions.
Johnston's startup Starcloud plans to build an orbital data center with a massive solar power plant by around 2035. Several rocket launches will be necessary to transport all the servers and the required solar panels into space. Robots would also be launched into space to assemble the data center in orbit and perform repairs.
The extraterrestrial data center is expected to require 5 gigawatts of electricity. That's equivalent to the output of five commercial nuclear power plants.
Since they are to be powered by solar energy, the satellites must keep their solar panels pointed at the sun at all times. The chosen orbit is designed so that the satellites never pass through the Earth's shadow.
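Why the choice of orbit matters can be estimated with simple geometry: a satellite in a circular orbit stays permanently in sunlight as long as the angle between the sun and its orbital plane, the so-called beta angle, remains above a critical value that depends on altitude. The short Python sketch below uses only this textbook relation and the altitudes mentioned in this article; it is an illustration, not Starcloud's actual orbit design.

import math

R_EARTH_KM = 6371.0  # mean Earth radius

def critical_beta_deg(altitude_km: float) -> float:
    """Minimum beta angle (sun vs. orbital plane) above which a circular
    orbit at the given altitude never enters Earth's shadow."""
    r = R_EARTH_KM + altitude_km
    return math.degrees(math.asin(R_EARTH_KM / r))

for h in (350, 1300):  # altitudes mentioned in the article
    print(f"{h} km: eclipse-free above a beta angle of ~{critical_beta_deg(h):.0f} degrees")

# Prints roughly 71 degrees at 350 km and 56 degrees at 1300 km. Missions that
# need constant sunlight therefore typically choose a dawn-dusk orbit, whose
# beta angle stays high all year round.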
Cooling the computers could pose a problem. AI processors generate a lot of heat, and in space there is no air, which on Earth carries away part of the heat from hot chips. In orbit, cooling therefore relies solely on thermal radiation. To maximize it, the data center needs a massive radiator, similar to a household heating element, that radiates the excess heat into the airless surroundings.
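How big such a radiator has to be follows from the Stefan-Boltzmann law, which caps how much heat a surface can shed by radiation alone. The following back-of-the-envelope estimate uses assumed values for radiator temperature and emissivity; they are illustrative, not Starcloud data.

STEFAN_BOLTZMANN = 5.670e-8  # W / (m^2 * K^4)

def radiator_area_m2(heat_w: float, temp_k: float = 300.0, emissivity: float = 0.9) -> float:
    """Radiating area needed to reject 'heat_w' watts purely by thermal radiation
    (absorbed sunlight and Earth's infrared glow are ignored for simplicity)."""
    return heat_w / (emissivity * STEFAN_BOLTZMANN * temp_k**4)

area_km2 = radiator_area_m2(5e9) / 1e6  # the 5-gigawatt figure from the article
print(f"~{area_km2:.0f} square kilometers of radiator surface")
# About 12 square kilometers under these assumptions, which is why the radiator
# is the make-or-break component of an orbital data center.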
According to Johnston, Starcloud plans to build by far the largest radiator ever designed for space. It is to weigh a tenth as much and cost a hundredth as much as the best available radiator with the same cooling capacity. Johnston declined to reveal further technical details, saying they are Starcloud's core trade secret.
One open question is whether the processors can withstand the radiation in space. On Earth, the atmosphere and our planet's magnetic field provide natural protection. But in space, this shield disappears. The radiation can damage the electronics and lead to the loss of stored information. Starcloud plans to test the radiation resistance of its AI processors with a test satellite called Starcloud-1.
A test satellite launches with Mini-GPT
Starcloud-1 was launched into an orbit 350 kilometers above Earth on November 2 by a SpaceX rocket. The 60-kilogram test satellite, about the size of a small refrigerator, carries an Nvidia H100 processor. Hundreds of thousands of chips of this type are still used by companies such as OpenAI in terrestrial data centers to develop the latest AI chatbots. The single chip in Starcloud-1 will be sufficient to run a small AI model called Mini-GPT.
In October 2026, the startup's first commercial satellite, Starcloud-2, is scheduled to enter orbit. It will consume 7 kilowatts of power, which will be supplied by an 80-square-meter photovoltaic system. Johnston says Starcloud-2 will provide computing capacity for American military satellites. The military is prepared to pay higher prices for these services than commercial customers, says the Starcloud CEO. He expects that Starcloud-2 alone will generate more revenue than the design, construction, and launch costs of the satellite itself.
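A quick plausibility check on those figures (the cell efficiency below is an assumption, not a Starcloud specification): above the atmosphere a sun-pointing array receives about 1,361 watts per square meter, so 80 square meters delivers far more than the 7 kilowatts the satellite consumes, leaving margin for losses and degradation.

SOLAR_CONSTANT = 1361.0  # W/m^2 of sunlight above the atmosphere

def array_output_kw(area_m2: float, efficiency: float = 0.25) -> float:
    """Electrical output of a sun-pointing solar array; the 25% cell efficiency
    is an illustrative assumption, and pointing or conversion losses are ignored."""
    return area_m2 * SOLAR_CONSTANT * efficiency / 1000.0

print(f"80 m^2 of panels: ~{array_output_kw(80):.0f} kW")  # roughly 27 kW, versus the 7 kW load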
Johnston and his team also have to find solutions to the problem of space debris. Protecting the Starcloud data centers, with their gigantic solar panels, from collisions with debris from decommissioned satellites will be anything but trivial. Especially at altitudes between 400 and 800 kilometers above Earth, satellites like those of Starlink are taking up more and more space. Therefore, Starcloud plans to orbit its satellites at an altitude of around 1,300 kilometers above Earth in the long term, where the risk of collision is significantly lower. This would result in slightly greater delays in data transmission to Earth compared to the lower orbit of Starcloud-1. Johnston says that even then, the data transfer would be fast enough for the use of AI.
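The extra delay Johnston mentions can be bounded with the speed of light alone. The sketch below compares the two altitudes for the best case of a satellite directly overhead; real response times are higher once slant range, ground networks and processing are added.

SPEED_OF_LIGHT_KM_S = 299_792.458

def round_trip_ms(altitude_km: float) -> float:
    """Minimum radio round-trip time to a satellite directly overhead."""
    return 2 * altitude_km / SPEED_OF_LIGHT_KM_S * 1000.0

for h in (350, 1300):
    print(f"{h} km: ~{round_trip_ms(h):.1f} ms round trip")
# Roughly 2.3 ms versus 8.7 ms; both are negligible next to the seconds an AI
# chatbot typically needs to generate an answer.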
Resource consumption and costs
Starcloud is betting that launching satellites into space will become significantly cheaper over time. Currently, anyone wanting to launch a satellite into space has to pay several thousand dollars per kilogram. According to Johnston, however, an orbital data center will only become profitable when the price of launching into space drops to $500 per kilogram of payload. He hopes that Starship, SpaceX's new rocket, will undercut these costs.
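How decisive the launch price is can be seen with simple arithmetic. The per-kilogram prices below follow the rough figures in this article; the comparison itself is only an illustration, not a Starcloud cost calculation.

def launch_cost_usd(payload_kg: float, price_per_kg: float) -> float:
    """Launch cost for a payload at a given price per kilogram."""
    return payload_kg * price_per_kg

satellite_kg = 60  # mass of the Starcloud-1 test satellite, as stated above
for price_per_kg in (3000, 500):  # today's rough price level versus Johnston's target
    cost = launch_cost_usd(satellite_kg, price_per_kg)
    print(f"${price_per_kg}/kg: launching {satellite_kg} kg costs ${cost:,.0f}")
# $180,000 versus $30,000 for a single small satellite; a full-scale orbital data
# center would weigh orders of magnitude more, so the per-kilogram price dominates.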
A study published last year by the Franco-Italian aerospace company Thales Alenia Space examined the environmental impact of orbital data centers. According to the study, orbital servers would benefit the environment, provided the emissions of the rockets themselves are cut by a factor of ten. The authors expect that Europe could build and operate a climate-neutral data center in orbit by 2050.
The international race to the space cloud
Starcloud's plans sound like science fiction. The abundance of solar energy above Earth's atmosphere is certainly a compelling argument. But operating data centers in space would likely be far more difficult than on Earth. Philip Johnston and his colleagues, however, are not the only ones who believe in the vision.
The Chinese space company Adaspace launched twelve satellites equipped with AI processors into space in May. The goal is to eventually create a gigantic network of data centers consisting of 2,800 satellites.
The Abu Dhabi-based startup Madari Space plans to launch an orbital data center by the end of 2026. Madari's primary focus, however, is not on AI chatbots, but rather on processing data from Earth observation satellites. Instead of raw data, only the results of the analyses would be transmitted to Earth. Transferring these smaller datasets would be significantly faster, thus reducing response times to wildfires or floods. Madari is backed by an innovation fund established by Mohammed bin Rashid Al Maktoum, the ruler of the Emirate of Dubai.
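The numbers in the following sketch are purely hypothetical; they merely illustrate the principle Madari is counting on, namely that a compact analysis result travels over a limited downlink much faster than raw imagery.

DOWNLINK_MBIT_S = 200.0  # assumed downlink rate, purely illustrative

def transfer_seconds(megabytes: float, mbit_per_s: float = DOWNLINK_MBIT_S) -> float:
    """Time needed to downlink a payload of the given size."""
    return megabytes * 8.0 / mbit_per_s

raw_scene_mb = 5000.0   # hypothetical raw Earth-observation scene (about 5 GB)
result_mb = 0.5         # hypothetical wildfire or flood detection report
print(f"raw scene:   ~{transfer_seconds(raw_scene_mb):.0f} seconds")   # about 200 s
print(f"result only: ~{transfer_seconds(result_mb):.2f} seconds")      # about 0.02 s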
The Polish company KP Labs and IBM are collaborating on a similar concept to Madari. And in August, the American company Axiom Space installed its own computing module on the International Space Station (ISS). By the end of the year, Axiom plans to launch two more modules into orbit around the Earth. In the future, the company intends to create an orbital computing network that will process data for the American military as well as for a commercial space station.
Realistic future scenario or "science fantasy"?
A kind of computer race into space is already underway. However, some experts warn against excessive short-term expectations.
“The data centers we currently have on Earth are very large. Bringing something similar into space would be an enormous undertaking,” says Malcolm Macdonald, a space expert and professor at the University of Strathclyde in Glasgow, Scotland. Macdonald doubts that data centers in space can be profitable in the near future. But he concedes that this could change when SpaceX’s Starship—currently just a prototype—begins regular operation.
Space researcher Michael Gschweitl from ETH Zurich believes that data centers in space are feasible. "The development of small-scale space computers is already well advanced – for example, for analyzing data from Earth observation satellites," he says. A large data center with a capacity of 5 gigawatts, as planned by Starcloud, is physically possible but a far greater engineering challenge. Gschweitl does not venture a prediction as to when exactly this will become reality. "It is very likely, however, that something like this will not be achieved by a small startup, but by a larger, established space company," says the ETH researcher.
Sources: Visualizations: Starcloud; Mesa satellite images; images from Google Earth; inspired by FT.