AI will need vast amounts of electricity in the coming years, but that power doesn’t necessarily need to be generated close to where humans live.
NVIDIA CEO Jensen Huang has proposed an unconventional solution to one of artificial intelligence’s most pressing challenges: the enormous energy requirements for training advanced AI models. Rather than building data centers near population centers and straining existing power grids, Huang suggests a paradigm shift—training AI in remote locations with abundant renewable energy, then deploying the trained models wherever they’re needed. His vision reframes the geography of AI infrastructure around energy availability rather than human proximity.

“AI doesn’t care where it goes to school,” Huang explained. “The world actually has an abundance of energy. It’s just in the wrong place. The amount of energy that we get from the sun, the amount of energy we get from thermal, from sustainable energy is quite high. It’s just in the wrong place. It’s in places where people don’t live. They’re in deserts. They’re in places that are quite cold. They’re in places that are not the most delightful places with wonderful climate and beachfront properties and things like that.”
Huang’s proposal is straightforward in its logic: “We need to go make sure that those areas have power generation capabilities. Then we can put data centers there. We could train our AI models there. And when they ‘graduate’, we bring the AI models back and use it wherever we like to use it.”
The NVIDIA chief emphasized that this approach eliminates the need for complex energy storage infrastructure. “I think in the future you’ll probably see data centers being built in countries that are currently far away from population, far away from the power grid. And there’s no reason to build batteries. Just use the energy, train AI models, and bring the trained AI model back to population.”
By decoupling AI training from deployment, the technology industry could tap into vast renewable energy resources in inhospitable regions—solar farms in deserts, wind installations in remote coastal areas, or geothermal plants in volcanic regions—without the logistical challenges of transmitting that power across continents or storing it in massive battery arrays. Once trained, AI models are essentially software that can be transmitted digitally anywhere in the world instantaneously, making the physical location of training infrastructure irrelevant to end users.
This concept of remote AI infrastructure is gaining traction beyond Earth itself. Amazon founder Jeff Bezos recently predicted that AI data centers would be operating in space within 20 years, taking the logic of remote training to its ultimate conclusion. Notably, NVIDIA has backed this vision with capital, funding Starcloud, a startup developing space-based AI data center technology. These orbital facilities would harness unfiltered solar energy and radiate waste heat directly into space, while keeping power-hungry operations literally above the demands of terrestrial energy grids. As AI models continue to grow in size and capability, Huang’s philosophy of training where energy is abundant—whether in Earth’s deserts or in orbit—may become not just practical, but essential.