Jeff Bezos has predicted datacenters in space within 20 years, but Elon Musk's timeline is far more aggressive.
In a recent conversation with NVIDIA CEO Jensen Huang, Tesla and SpaceX founder Elon Musk made a striking prediction that could reshape the future of artificial intelligence infrastructure. While discussing the escalating challenges of power generation and cooling for AI systems, Musk argued that space-based AI compute will become not just viable but economically superior to ground-based alternatives within just five years. His thesis centers on a simple but profound reality: space offers continuous solar power, passive radiative cooling, and none of the electrical grid constraints that are rapidly becoming bottlenecks for AI development on Earth.

The conversation began with Huang illustrating the absurdity of current AI infrastructure design. “Just look at the supercomputers we’re building together,” he told Musk. “Let’s say each one of the racks is two tons. Out of that two tons, 1.95 of it is probably for cooling. Just imagine how tiny that little supercomputer is. Each one of these GB300 racks would just be a little tiny thing.”
This observation provided the perfect setup for Musk’s central argument. “My estimate is that the cost effectiveness of AI in space will be overwhelmingly better than AI on the ground, far long before you exhaust potential energy sources on Earth,” Musk said. “Long, long before, meaning I think even perhaps in the four or five year timeframe, the lowest cost way to do AI compute will be with solar powered AI satellites. So I’d say not more than five years from now.”
Musk then broke down the mathematics that make terrestrial AI compute increasingly untenable at scale. “Electricity generation is already becoming a challenge. So if you start doing any kind of scaling for both electricity generation and cooling, you realize space is incredibly compelling,” he explained. “Let’s say you wanted to do 200 or 300 gigawatts per year of AI compute. It’s very difficult to do that on Earth. The US average electricity usage, last time I checked, was around 460 gigawatts per year average usage. So something like 300 gigawatts a year would be like two thirds of US electricity production per year. There’s no way you’re building power plants at that level.”
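Musk's arithmetic roughly checks out. A sketch of the comparison (note the hedge: "gigawatts per year" in the quote is colloquial; both figures are average power levels, not annual energy):

```python
# Back-of-envelope check of the quoted figures. Both numbers are
# average power draws in gigawatts; "per year" in the quote is loose phrasing.
US_AVG_POWER_GW = 460   # approximate US average electrical load, as cited
AI_COMPUTE_GW = 300     # hypothetical AI compute load from the quote

fraction = AI_COMPUTE_GW / US_AVG_POWER_GW
print(f"300 GW is {fraction:.0%} of a 460 GW average load")
```

At about 65 percent, that matches the "two thirds of US electricity production" claim.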
The scale problem only gets worse from there. “And then if you take it up to say a terawatt per year, impossible,” Musk continued. “You have to do that in space. There just is no way to do a terawatt per year on Earth. And in space, you’ve got continuous solar, you don’t actually need batteries because it’s always sunny in space. And the solar panels actually become cheaper because you don’t need glass or framing. And the cooling is just radiative.”
The implications of Musk’s prediction are profound, and he’s not alone in seeing this future. Jeff Bezos recently forecast that we’ll have AI datacenters in space within 20 years, while NVIDIA-backed startup Starcloud announced plans to build orbital datacenters in late October. Meanwhile, Google has revealed Project Suncatcher, an initiative to send TPUs to space in early 2027 to scale machine learning compute. The convergence of these announcements from tech’s biggest players suggests this isn’t science fiction but an emerging consensus about AI’s necessary trajectory. If Musk’s timeline proves accurate, the race to orbit may define which companies lead the next era of artificial intelligence—and it could happen far sooner than anyone expected.