
Artificial Intelligence

Google’s Project Suncatcher and the Rise of Orbital AI


Securities.io maintains rigorous editorial standards and may receive compensation from reviewed links. We are not a registered investment adviser and this is not investment advice. Please view our affiliate disclosure.

Summary:
Google’s Project Suncatcher explores placing AI data centers in orbit, using solar-powered satellites equipped with TPUs to bypass Earth’s energy constraints and enable continuous, high-density computation in space.

Moving AI to Orbit

As AI booms, several supply constraints have emerged. The first was GPUs, with this specialized hardware moving from a niche gaming use to mass adoption by AI data centers. As a result, Nvidia (NVDA +0.68%), the sector's leader, has grown into the world's largest company.

But another limitation is appearing: energy supply.

This is because AI data centers are now measured not so much by their computational power as by their power consumption. This is why AI companies are scrambling to restart nuclear power stations and secure the first SMR prototypes, while state regulators fast-track approval of new gas-fired power plants.

As the rush to find energy for data centers is on, eyes are turning to another option: space-based solar power.

The possibility of an unlimited energy supply from orbital satellites is something we already analyzed extensively in “Space-Based Energy Solutions For Endless Clean Energy”.

But this concept has always been somewhat limited by the need to convert sunlight into electricity, turn that electricity into microwaves to beam it back down to Earth, and then convert it back into electricity.

This increases the complexity of the power satellites, requires more ground-based infrastructure, and drastically reduces the overall efficiency of the process, as each energy conversion leads to losses. So this could only work with very cheap orbital launches.

Alternatively, if the power was directly used in orbit, this would be a lot more efficient and become economically viable sooner. Especially if the final “product” can be easily sent back to Earth.

So in theory, data centers in space could be the ideal option: they need a lot of power, but sending the results of their computations back to Earth is trivial, requires no new infrastructure, and causes no energy losses.

Building on this idea, Alphabet/Google has just announced “Project Suncatcher”, exploring what an orbital AI computation system would look like.

“Inspired by other Google moonshots like autonomous vehicles and quantum computing, we’ve begun work on the foundational work needed to one day make this future possible.

We’re exploring how an interconnected network of solar-powered satellites, equipped with our Tensor Processing Unit (TPU) AI chips, could harness the full power of the Sun.”

Why It Could Work

A key reason solar power is hard to use for data centers and AI is that they need a continuous, reliable power supply, while ground-based solar power is intermittent and stops working at night.

But solar arrays in the right orbit could produce power 24/7, without interruption or fluctuation. Direct exposure to sunlight also makes these panels far more productive.

“The Sun is the ultimate energy source in our solar system, emitting more power than 100 trillion times humanity’s total electricity production.

In the right orbit, a solar panel can be up to 8 times more productive than on Earth, and produce power nearly continuously, reducing the need for batteries.”
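As a rough sanity check on that figure (all numbers below are generic illustrative assumptions, not values from Google's paper), the average yield per square meter can be compared directly: a panel in a near-continuously sunlit orbit sees full extraterrestrial irradiance almost all the time, while even a good terrestrial site averages far less once night, atmosphere, and weather are accounted for.

```python
# Rough, illustrative comparison of orbital vs. terrestrial solar yield.
# All values are generic assumptions, not figures from Google's paper.

SOLAR_CONSTANT = 1361          # W/m^2, irradiance above the atmosphere
ORBIT_AVAILABILITY = 0.99      # fraction of time in sunlight (dawn-dusk orbit)

GROUND_PEAK = 1000             # W/m^2, clear-sky noon at the surface
GROUND_CAPACITY_FACTOR = 0.20  # typical utility-scale site (night + weather)

orbital_avg = SOLAR_CONSTANT * ORBIT_AVAILABILITY   # average W/m^2 in orbit
ground_avg = GROUND_PEAK * GROUND_CAPACITY_FACTOR   # average W/m^2 on Earth

print(f"Orbital average: {orbital_avg:.0f} W/m^2")
print(f"Ground average:  {ground_avg:.0f} W/m^2")
print(f"Ratio:           {orbital_avg / ground_avg:.1f}x")
```

With these assumed values the ratio comes out around 7x, the same order of magnitude as the quoted "up to 8 times".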

However, a few key technologies need to be developed and tested for any AI computation to work in orbit.

Orbital AI Key Challenges

High-Bandwidth Inter-Satellite Links for Orbital AI

Modern data centers are extremely complex, linking together thousands or even millions of pieces of computing hardware, with very demanding requirements on connectivity and reliability.

As our ability to send things into orbit is still limited to relatively small objects, any reasonably big data center in space will need to be made of a network of satellites communicating with each other.

Current inter-satellite link (ISL) technology only offers data rates in the range of 1–100 Gbps, much lower than the hundreds of gigabits per second per chip offered by Google’s low-latency optical Inter-Chip Interconnect (ICI) currently used in its AI data centers.

Instead, Google proposes to use Commercial Off-The-Shelf (COTS) Dense Wavelength Division Multiplexing (DWDM) transceiver technology.

This system works by assigning each signal to a specific, unique wavelength (color) of light within the infrared spectrum. This way, the same telescope can receive data from multiple satellites at once.
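The capacity math behind DWDM is simply multiplicative: aggregate bandwidth equals the number of wavelengths times the per-wavelength data rate. A minimal sketch, where the channel count and per-channel rate are illustrative assumptions (chosen so the totals match the bench demonstrator figures cited below), not Google's actual configuration:

```python
# Aggregate bandwidth of a DWDM optical link: each wavelength ("color")
# carries an independent data stream through the same telescope.
# Channel count and per-channel rate are illustrative assumptions.

channels = 8                # distinct wavelengths multiplexed together
per_channel_gbps = 100      # assumed COTS transceiver rate per wavelength

unidirectional_gbps = channels * per_channel_gbps
bidirectional_tbps = 2 * unidirectional_gbps / 1000

print(f"Unidirectional: {unidirectional_gbps} Gbps")
print(f"Bidirectional:  {bidirectional_tbps:.1f} Tbps")
```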

Source: Google

At short range (∼10 km for a 10 cm telescope), a bench-scale demonstrator using off-the-shelf components successfully achieved 800 Gbps unidirectional (1.6 Tbps bidirectional) transmission.

So in theory, off-the-shelf technology already exists for this kind of density in data transmission between orbital AI data center satellites.

Orbital Constellations

Most satellite constellations normally keep a wide distance between satellites in order to limit the risks of collision and maintain optimal orbital trajectories.

But the design proposed by Google for Project Suncatcher would require the data center satellites to fly much closer together. For example, an 81-satellite constellation would be clustered within a sphere of 1 km radius (3,280 feet).

Source: Google

The company’s calculations indicate that such a constellation could be kept stable, even accounting for orbital perturbations like atmospheric drag, solar radiation pressure, thermal (cooling) radiation, the Moon’s gravity, other satellites, etc.

This means that, while not negligible, the drift from the proper orbits should be manageable with conventional satellite technology.

“For an example cluster as described, adjusting the axis-ratio to 2:1.0037 can reduce J2 drift to <3 m/s/year per km of maximal distance from reference orbit.”
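To put that drift bound in perspective, a back-of-the-envelope station-keeping budget can be derived from it. The mission duration and thruster specific impulse below are generic electric-propulsion assumptions, not values from Google's paper:

```python
# If J2 drift costs <3 m/s per year per km of separation, a satellite at the
# edge of a 1 km radius cluster needs roughly this much station-keeping.
# Mission length and Isp are generic small-sat assumptions.
import math

drift_dv_per_year = 3.0   # m/s per year per km (quoted upper bound)
cluster_radius_km = 1.0
mission_years = 5

total_dv = drift_dv_per_year * cluster_radius_km * mission_years  # m/s

# Propellant fraction via the rocket equation, assuming an ion thruster
isp = 1500                # s, typical electric propulsion
g0 = 9.81                 # m/s^2
prop_fraction = 1 - math.exp(-total_dv / (isp * g0))

print(f"5-year delta-v:      {total_dv:.0f} m/s")
print(f"Propellant fraction: {prop_fraction * 100:.2f}% of wet mass")
```

Under these assumptions, five years of station-keeping costs around 15 m/s, a tenth of a percent of spacecraft mass in propellant, which supports the "manageable with conventional satellite technology" conclusion.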

The study also mentions that there is probably an upper limit to how big such constellations can get, as at some point the satellites would start interfering with each other’s sunlight capture or waste-heat rejection.

Source: Google

Radiation Tolerance Of Hardware

Most computing hardware is vulnerable to radiation, with cosmic and solar radiation liable to randomly flip a “1” into a “0”, causing errors in calculations.

For Project Suncatcher, Google is looking to use its own TPUs (Tensor Processing Units) called Trillium.

They tested Trillium’s resistance to space radiation by exposing it to a 67 MeV proton beam, testing for impact from total ionizing dose (TID) and single event effects (SEEs).

Of the different elements of the Trillium TPU, the High Bandwidth Memory (HBM) subsystems exhibited the most sensitivity to TID.

HBM was the most SEE-sensitive component, primarily manifesting as uncorrectable ECC errors (UECCs).

The HBM subsystems only began showing irregularities after a cumulative dose of 2 krad(Si), nearly 3x the expected (shielded) five-year mission dose. No hard failures were attributable to TID up to the maximum tested dose of 15 krad(Si) on a single chip.
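The margins implied by those numbers are easy to back out. This is just arithmetic on the figures quoted above, with the expected five-year dose inferred from the "nearly 3x" statement:

```python
# Implied radiation margins from the quoted test figures.
hbm_anomaly_threshold_krad = 2.0   # first HBM irregularities (TID)
five_year_margin = 3.0             # "nearly 3x" the shielded 5-year dose
max_tested_krad = 15.0             # no hard failures up to this dose

# Expected shielded dose over a five-year mission, in krad(Si)
expected_5yr_dose = hbm_anomaly_threshold_krad / five_year_margin
hard_failure_margin = max_tested_krad / expected_5yr_dose

print(f"Implied 5-year shielded dose: ~{expected_5yr_dose:.2f} krad(Si)")
print(f"Margin to max tested dose:    >{hard_failure_margin:.0f}x")
```

In other words, the chips survived more than twenty times the dose they would be expected to accumulate over a shielded five-year mission before the maximum tested level was reached.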

Overall, this came as a surprise and would indicate that TPUs are remarkably resistant to radiation and are an especially good fit for space-based data centers.

Economic Feasibility

So it seems that the existing technologies, from TPUs to satellite communication and mastery of orbital dynamics, are already enough to build data centers in space, at least when choosing the right design.

But of course, this will only matter if these data centers are economically competitive compared to Earth-based data centers.

Previous economic feasibility analyses of space-based solar power for Earth use tend to consider $500/kg to Geostationary Transfer Orbit (GTO) as a viability threshold for orbital energy projects, which is equivalent to ∼$200/kg to LEO (Low Earth Orbit).

Reaching that target will depend largely on SpaceX’s ability to scale up production and the launch cadence of its largest rocket yet, Starship.

If the learning rate is sustained—which would require ∼180 Starship launches/year—launch prices could fall to <$200/kg by ∼2035.

At that price point, the cost of launching and operating a space-based data center could become roughly comparable to the reported energy costs of an equivalent terrestrial data center on a per-kilowatt/year basis.
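The projection follows a standard learning-curve (Wright's law) argument: cost falls by a fixed fraction every time cumulative output doubles. A minimal sketch, where the starting price and the 20% learning rate are illustrative assumptions rather than figures from the analysis:

```python
# Wright's law: cost = c0 * cumulative_units ** log2(1 - learning_rate)
# Starting price and learning rate are illustrative assumptions.
import math

c0 = 1500             # $/kg to LEO at cumulative launch #1 (assumed)
learning_rate = 0.20  # 20% cost drop per doubling of cumulative launches

exponent = math.log2(1 - learning_rate)   # negative learning exponent

def cost_per_kg(cumulative_launches: int) -> float:
    """Projected $/kg after a given number of cumulative launches."""
    return c0 * cumulative_launches ** exponent

for n in [1, 10, 100, 1000]:
    print(f"After {n:>5} launches: ${cost_per_kg(n):,.0f}/kg")
```

With these assumed parameters, the price falls below $200/kg after roughly 1,000 cumulative launches, about six years at 180 launches per year, which is in the same ballpark as the ∼2035 estimate above.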

Overall, a rather high bar in launch cost reduction needs to be cleared. But if the cost trajectory of the past decade continues, this is not unrealistic either.

Investor Takeaway:
Orbital AI data centers remain a long-dated thesis, but Project Suncatcher signals how falling launch costs, space networking advances, and hyperscale AI demand could converge—benefiting satellite operators, launch providers, and space-data firms like Planet Labs.

Conclusion

Orbital data centers are unlikely to become a reality before 2030-2035, in large part because launch costs first need to fall further.

This is not to say that experiments, tests, and prototypes will not make the idea progress further before that, as illustrated by Google Project Suncatcher.

It is likely that other prominent AI companies like Microsoft (MSFT +1.18%), OpenAI, Meta (META +0.32%), or Alibaba (BABA +0.22%) will also test their own version of this idea.

Two companies especially likely to move quickly in this space are SpaceX, as Elon Musk also owns xAI, and Amazon (AMZN +1.6%), as Jeff Bezos trails SpaceX closely with his own space company, Blue Origin.

Investing in Orbital AI Data Centers

Planet Labs

Besides Alphabet itself, an investment focused on the idea of space-based data centers would be Planet Labs, the partner Google chose to work with on testing the technology for Project Suncatcher.

“Our next step is a learning mission in partnership with Planet to launch two prototype satellites by early 2027 that will test our hardware in orbit, laying the groundwork for a future era of massively-scaled computation in space.”

Planet Labs currently focuses on Earth-observation satellites. The company operates a fleet of approximately 200 Earth-imaging satellites, the largest in history, imaging the Earth’s entire landmass daily.

These images are high-resolution and include hyperspectral data (visible + infrared and UV light), making them useful for geodesy, agriculture, insurance, finance, and governments (including military applications).

They can be used for monitoring, disaster response (wildfire, tornadoes, etc.), defense & intelligence, mapping infrastructures, detecting methane emissions, etc.

 

Source: Planet Labs

The company offers transparent pricing, with different subscriptions depending on the regions of the world covered and the number of square kilometers of surface demanded. 90% of revenue is recurring and from annual or multi-year contracts.

Source: Planet Labs

Planet Labs recorded $245 million in revenues in the 2025 fiscal year, doubling from $122 million in 2022, with record revenues in Q1 2026 and an adjusted EBITDA turning positive for the first time in Q4 2025.

The largest source of revenues is the North American region (45%), and the defense and intelligence segment represents more than half of revenues.

Source: Planet Labs

As a trusted provider of data, Planet Labs could benefit from a few trends, irrespective of where the space industry goes:

  • It can license out the images to AI companies, or use them itself to train its own AIs, both for better real-time monitoring and novel insights.
  • It will benefit from the price war between launch providers like SpaceX, Relativity Space, and Rocket Lab, making the maintenance and replacement of its satellite fleet cheaper.
  • It will benefit from the economies of scale in satellite manufacturing, making new, more capable models cheaper, as it demonstrated with the recent addition of hyperspectral data to its offerings.
  • Larger launch vehicles should enable the conception of larger, more capable satellites, with potentially much longer lifespans, as this is mostly determined by the volume of fuel that the satellite can contain and use to maintain a stable orbit.

Experience in creating and operating an orbital AI data center jointly with Google should be added to that list in less than two years.

Overall, Planet Labs is an interesting stock for betting on a growing orbital economy, beyond the obvious positions in rocket companies like SpaceX (likely to IPO in 2026) or Rocket Lab (RKLB -0.34%).

(You can read more about Planet Labs’ business model and future in our investment report dedicated to the company.)

Jonathan is a former biochemist researcher who worked in genetic analysis and clinical trials. He is now a stock analyst and finance writer with a focus on innovation, market cycles, and geopolitics in his publication “The Eurasian Century”.

Advertiser Disclosure: Securities.io is committed to rigorous editorial standards to provide our readers with accurate reviews and ratings. We may receive compensation when you click on links to products we reviewed.

ESMA: CFDs are complex instruments and come with a high risk of losing money rapidly due to leverage. Between 74-89% of retail investor accounts lose money when trading CFDs. You should consider whether you understand how CFDs work and whether you can afford to take the high risk of losing your money.

Investment advice disclaimer: The information contained on this website is provided for educational purposes, and does not constitute investment advice.

Trading Risk Disclaimer: There is a very high degree of risk involved in trading securities. Trading in any type of financial product, including forex, CFDs, stocks, and cryptocurrencies, carries significant risk.

This risk is higher with Cryptocurrencies due to markets being decentralized and non-regulated. You should be aware that you may lose a significant portion of your portfolio.

Securities.io is not a registered broker, analyst, or investment advisor.