Google is taking AI infrastructure to new heights—literally. The tech giant announced Project Suncatcher on November 4, 2025, a moonshot initiative to build solar-powered satellite constellations equipped with its Tensor Processing Units (TPUs) for machine learning compute in space.
The ambitious project will reach its first major milestone in early 2027, when Google plans to launch two prototype satellites in partnership with Planet, a satellite imaging company. The mission will test whether TPUs can operate effectively in the harsh conditions of low-Earth orbit and validate the use of optical inter-satellite links for distributed ML workloads.

“Our TPUs are headed to space! Inspired by our history of moonshots, from quantum computing to autonomous driving, Project Suncatcher is exploring how we could one day build scalable ML compute systems in space, harnessing more of the sun’s power (which emits more power than 100 trillion times humanity’s total electricity production),” said Sundar Pichai, Google’s CEO, in a statement announcing the project.
Why Space?
The rationale behind Project Suncatcher centers on energy availability and efficiency. In the right orbit, solar panels can be up to eight times more productive than on Earth and produce power nearly continuously, eliminating the need for heavy battery systems. Google’s proposed system would operate in a dawn-dusk sun-synchronous low-Earth orbit, where satellites would receive near-constant sunlight.
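The scale of that energy advantage is easy to sanity-check. The sketch below compares annual solar yield per square meter of panel in a near-continuously lit dawn-dusk orbit against a good terrestrial site; only the "up to eight times" claim comes from the announcement, while the irradiance, efficiency, and capacity-factor figures are illustrative assumptions.

```python
# Back-of-envelope solar yield: dawn-dusk sun-synchronous orbit vs. a
# good terrestrial site. All specific numbers below are assumptions for
# illustration, not Google's published figures.

SOLAR_CONSTANT_W_M2 = 1361          # irradiance above the atmosphere
PANEL_EFFICIENCY = 0.30             # assumed high-efficiency space cell
HOURS_PER_YEAR = 8766

# Dawn-dusk SSO: near-constant sunlight, so capacity factor ~ 1.0
orbit_capacity_factor = 0.99
# Good terrestrial site: ~1000 W/m^2 at peak, ~20-25% capacity factor
ground_peak_w_m2 = 1000
ground_capacity_factor = 0.22

orbit_kwh_per_m2 = (SOLAR_CONSTANT_W_M2 * PANEL_EFFICIENCY
                    * orbit_capacity_factor * HOURS_PER_YEAR / 1000)
ground_kwh_per_m2 = (ground_peak_w_m2 * PANEL_EFFICIENCY
                     * ground_capacity_factor * HOURS_PER_YEAR / 1000)

ratio = orbit_kwh_per_m2 / ground_kwh_per_m2
print(f"orbit:  {orbit_kwh_per_m2:,.0f} kWh/m^2/yr")
print(f"ground: {ground_kwh_per_m2:,.0f} kWh/m^2/yr")
print(f"ratio:  {ratio:.1f}x")
```

With these assumptions the orbital panel delivers roughly six times the annual energy of the ground panel; sunnier or cloudier terrestrial sites move the ratio toward or past the "up to eight times" figure.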
The company envisions compact constellations of networked satellites connected by free-space optical links, creating what could eventually become data center-scale infrastructure in space. This approach, Google argues, would minimize impact on terrestrial resources while unlocking tremendous potential for scale.
Technical Breakthroughs and Challenges
Google has already made significant progress on several technical fronts. The company’s research team achieved 800 Gbps each-way transmission (1.6 Tbps total) using a single transceiver pair in a bench-scale demonstration. To achieve data center-scale performance, the satellites would need to fly in extremely tight formations—within kilometers or even hundreds of meters of each other—far closer than any current satellite system.
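To see how single-pair link rates translate into cluster-scale bandwidth, the following sketch sizes the aggregate throughput when half a constellation exchanges data with the other half. Only the 1.6 Tbps per transceiver pair comes from the demonstration; the cluster size and per-satellite transceiver count are hypothetical.

```python
# Rough inter-satellite bandwidth sizing from the demonstrated link rate.
# PER_PAIR_TBPS is from Google's bench-scale demo; everything else here
# (cluster size, transceivers per satellite) is an assumed example.

PER_PAIR_TBPS = 1.6  # 800 Gbps each way over one transceiver pair

def cluster_bisection_tbps(n_satellites: int, transceivers_per_sat: int) -> float:
    """Aggregate bandwidth if half the cluster talks to the other half,
    with every transceiver on one side paired to one on the other."""
    pairs = (n_satellites // 2) * transceivers_per_sat
    return pairs * PER_PAIR_TBPS

# Hypothetical 81-satellite cluster, 4 optical transceivers per satellite
print(cluster_bisection_tbps(81, 4))  # 160 pairs * 1.6 = 256.0 Tbps
```

Numbers like these are why the satellites must fly so close together: free-space optical links at these rates need short ranges and precise pointing between spacecraft.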
Perhaps most surprisingly, Google’s Trillium v6e Cloud TPUs proved remarkably radiation-resistant when tested in a 67 MeV proton beam. The chips showed no hard failures from total ionizing dose effects up to 15 krad(Si), nearly three times the expected five-year mission dose for a shielded system in low-Earth orbit.
“Early research shows our Trillium-generation TPUs survived without damage when tested in a particle accelerator to simulate low-earth orbit levels of radiation,” Pichai noted. “However, significant challenges still remain like thermal management and on-orbit system reliability.”
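The implied radiation budget can be back-solved from the two reported figures. The sketch below derives the expected five-year dose from the 15 krad(Si) test ceiling and the "nearly three times" margin; the result is an inference from the article's numbers, not an independently published value.

```python
# Implied radiation margin: no hard TID failures up to 15 krad(Si),
# reported as nearly 3x the expected shielded five-year dose in LEO.
# The expected dose is back-solved from those two figures.

tested_tid_krad = 15.0
margin_factor = 3.0                 # "nearly three times"

expected_5yr_dose_krad = tested_tid_krad / margin_factor  # ~5 krad(Si)
annual_dose_krad = expected_5yr_dose_krad / 5

print(f"implied 5-year dose: {expected_5yr_dose_krad:.1f} krad(Si)")
print(f"implied annual dose: {annual_dose_krad:.1f} krad(Si)/yr")
```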
The company’s analysis also suggests the economics may be approaching viability. With projected launch costs potentially falling below $200 per kilogram by the mid-2030s, the cost of launching and operating a space-based data center could become roughly comparable to the energy costs of an equivalent terrestrial facility.
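A crude version of that comparison can be run directly. Only the sub-$200/kg launch price comes from the article; the specific power (watts of usable capacity per kilogram launched), satellite lifetime, and terrestrial electricity price below are assumptions chosen for illustration.

```python
# Sketch of the launch-cost-vs-energy-cost comparison. Amortize the
# per-kilogram launch price over the energy a kilogram of satellite can
# deliver across its lifetime, then compare to grid electricity.
# All parameters except launch_cost_per_kg are assumed values.

launch_cost_per_kg = 200.0       # projected mid-2030s price, per article
specific_power_w_per_kg = 100.0  # assumed usable watts per kg launched
lifetime_years = 5.0             # assumed satellite service life
terrestrial_price_per_kwh = 0.10 # assumed industrial electricity price

HOURS_PER_YEAR = 8766
kwh_per_kg = specific_power_w_per_kg / 1000 * lifetime_years * HOURS_PER_YEAR
launch_cost_per_kwh = launch_cost_per_kg / kwh_per_kg

print(f"launch cost:      ${launch_cost_per_kwh:.3f}/kWh")
print(f"terrestrial cost: ${terrestrial_price_per_kwh:.3f}/kWh")
```

Under these assumptions the amortized launch cost lands at a few cents per kilowatt-hour, the same order of magnitude as terrestrial electricity, which is the sense in which the two could become "roughly comparable."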
A Growing Trend
Google isn’t alone in eyeing space for AI infrastructure. Jeff Bezos has predicted that AI data centers will operate in space within 20 years, while Starcloud, a startup backed by both NVIDIA and Google, is also developing plans for space-based data centers.
The convergence of falling launch costs, improving space technology, and exponentially growing demand for AI compute is creating what some see as an inevitable shift toward space-based infrastructure. Project Suncatcher represents Google’s long-term bet on this future.
“Like any moonshot, it’s going to require us to solve a lot of complex engineering challenges,” Pichai acknowledged. “More testing and breakthroughs will be needed as we count down to launch two prototype satellites with Planet by early 2027, our next milestone of many.”
The 2027 learning mission will be crucial in validating Google’s models and proving that distributed machine learning workloads can run effectively across satellite constellations. If successful, it could pave the way for gigawatt-scale constellations that fundamentally reshape how and where AI computation happens.
Travis Beals, Senior Director of Paradigms of Intelligence at Google, who is leading Project Suncatcher, emphasized the exploratory nature of the effort: “Our initial analysis shows that the core concepts of space-based ML compute are not precluded by fundamental physics or insurmountable economic barriers.”
As Google prepares for its 2027 launch, the company joins a growing field of organizations betting that the future of AI—quite literally—lies in the stars.