According to DCD, Google is launching its TPU AI chips into space through Project Suncatcher, a partnership with Planet Labs that aims to put two prototype satellites in orbit by early 2027. The company envisions massive 81-satellite clusters forming kilometer-wide computing arrays in low-Earth orbit. Google CEO Sundar Pichai acknowledged this “moonshot” requires solving complex engineering challenges, while the company’s research shows Trillium-generation TPUs survived radiation testing simulating five years in orbit. The project faces significant technical hurdles, including thermal management, inter-satellite networking, and space radiation effects on high-bandwidth memory. It comes as multiple companies, including SpaceX and Blue Origin, pursue space data centers, with Jeff Bezos predicting gigawatt-scale installations a decade or more out.
Why even bother with space computing?
Here’s the thing – everyone’s chasing the same prize. We’re hitting physical limits on Earth for AI compute, and space offers essentially unlimited real estate and abundant solar power. But the economics are brutal right now. Google’s own analysis shows launch costs need to drop from today’s $1,500-$2,900 per kilogram to around $200 per kilogram before the idea even starts making sense. They’re banking on SpaceX’s Starship flying 180 times per year by 2035 to hit that target. Meanwhile, terrestrial data center power costs keep climbing. It’s a classic long-term bet where the payoff could be enormous if the pieces fall into place.
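To see why that $200-per-kilogram threshold matters, here’s a back-of-the-envelope sketch of launch cost per kilowatt of compute delivered to orbit. The mass-per-kilowatt figure is a hypothetical placeholder of mine, not a number from Google’s analysis:

```python
def launch_cost_per_kw(cost_per_kg, kg_per_kw=10):
    """One-time launch cost to put a kilowatt of compute hardware in orbit.

    kg_per_kw is a hypothetical mass budget (chips, structure, solar
    panels, radiators) per kilowatt of IT load -- an illustrative
    assumption, not a figure from the Suncatcher paper.
    """
    return cost_per_kg * kg_per_kw

# Today's launch prices vs. Google's target price:
today = launch_cost_per_kw(1_500)   # $15,000 per kW launched
target = launch_cost_per_kw(200)    # $2,000 per kW launched
print(f"today: ${today:,.0f}/kW, target: ${target:,.0f}/kW")
```

At the target price, the one-time launch cost per kilowatt starts to look roughly comparable to a few years of terrestrial data center electricity, which is the crux of the bet.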
The engineering nightmare
So how do you actually build a data center that floats? Google’s approach is fascinating – instead of one massive satellite, they want dozens of smaller ones flying in tight formation. Typical constellations keep satellites tens to hundreds of kilometers apart, but Google needs its cluster packed within about a kilometer, with neighbors just hundreds of meters apart, because received optical power falls off with the square of the distance. They’re talking about using optical inter-satellite links with dense wavelength division multiplexing to hit the 10Tbps bandwidth they need. Basically, they’re trying to recreate the high-speed networking of a terrestrial data center… in a vacuum.
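To get a feel for what wavelength division multiplexing buys them, here’s a minimal sketch of how many wavelength channels one optical link would need to reach a target aggregate rate. The per-wavelength rate is my assumption, not a figure from the paper:

```python
import math

def channels_needed(target_tbps, gbps_per_lambda=100):
    """Number of DWDM wavelength channels needed on a single optical link.

    gbps_per_lambda is an assumed per-wavelength data rate; commercial
    coherent optics commonly run on the order of 100-400 Gbps per channel.
    """
    return math.ceil(target_tbps * 1_000 / gbps_per_lambda)

print(channels_needed(10))       # 100 channels at 100 Gbps each
print(channels_needed(10, 400))  # 25 channels at 400 Gbps each
```

Multiplying many modest channels onto one beam is the same trick terrestrial fiber networks use to reach multi-terabit trunk capacities.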
Then there’s radiation. Google tested their chips in particle accelerators and found the TPUs themselves held up pretty well. But the high-bandwidth memory? That’s the weak spot. They’re seeing uncorrectable errors at rates that might be acceptable for AI inference workloads, but training? That’s still a big question mark. And when something breaks 650 kilometers up, you can’t just send a tech to replace it. Their solution? Over-provision with redundant hardware. Not exactly elegant, but sometimes brute force works.
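The brute-force redundancy idea is easy to quantify with a binomial model. This sketch is my framing, not Google’s: it computes the probability that enough modules stay healthy when each one survives the mission independently with probability p:

```python
from math import comb

def availability(n, k, p):
    """Probability that at least k of n independent modules still work."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Needing all 8 of 8 modules vs. flying 10 and needing any 8:
print(availability(8, 8, 0.95))   # ~0.66 -- fragile
print(availability(10, 8, 0.95))  # ~0.99 -- two spares change everything
```

That jump from two-thirds to near-certainty is why over-provisioning, inelegant as it is, is the standard answer when repair is impossible.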
The cooling and power reality
Think about this – in space there’s no air, so convective cooling is off the table; the only way to dump heat is to radiate it away. All that heat from thousands of TPUs has to go somewhere. Google’s paper mentions “advanced thermal interface materials” and passive heat transport systems moving heat to dedicated radiator surfaces. Passive systems are crucial because fewer moving parts means higher reliability when you can’t send maintenance crews. And power? They’re counting on massive solar arrays, but how well those arrays hold up over years of degradation in orbit is another huge unknown.
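Since radiation is the only heat path, radiator sizing follows the Stefan-Boltzmann law. Here’s a simplified sketch that ignores absorbed sunlight and Earth’s infrared load; the emissivity and temperature are typical assumed values, not numbers from Google’s paper:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(power_w, emissivity=0.9, temp_k=300):
    """Radiator area needed to reject power_w watts to deep space.

    Ignores solar, albedo, and Earth-IR heat loads, so this is a
    lower bound on the real area requirement.
    """
    return power_w / (emissivity * SIGMA * temp_k**4)

# Rejecting 100 kW of TPU heat with radiators near room temperature:
print(f"{radiator_area_m2(100_000):.0f} m^2")  # roughly 240 m^2
```

The strong T⁴ dependence is why running radiators hotter shrinks them dramatically – but hotter radiators also mean hotter chips, which is exactly the tension the thermal design has to resolve.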
The new space race implications
This isn’t just Google dreaming big. Elon Musk says SpaceX “will be doing” space data centers. Jeff Bezos is talking gigawatt-scale installations. Former Google CEO Eric Schmidt bought a rocket company specifically for this purpose. We’re seeing a fundamental shift in how tech giants think about infrastructure scaling. The Project Suncatcher paper lays out a vision that’s simultaneously ambitious and cautiously realistic about the challenges.
What’s really interesting is the timing. Early 2027 for the first two satellites means they’re moving faster than you might expect for something this complex. They’re clearly betting that the technical and economic pieces will fall into place in the next few years. Whether this becomes the future of computing or remains a fascinating research project, it’s pushing boundaries in ways that could benefit computing everywhere.
