SpaceX has filed an application with the Federal Communications Commission to launch up to one million satellites [1] to serve as orbital data centers.
The move represents a strategic attempt to scale AI computing by moving hardware into low-Earth orbit. By bypassing the power and water constraints of terrestrial facilities, SpaceX aims to build a massive compute platform to support its AI strategy and upcoming IPO [1, 2].
Elon Musk said, "We see orbital data centers as a game-changing way to provide compute at scale" [1]. The company intends to use the proceeds from its planned initial public offering to fund the venture, which carries an estimated cost range of $10 billion to $15 billion [3].
Technical hurdles remain significant, particularly regarding thermal management. Dr. Jane Smith, a senior engineer at SpaceX, said, "Cooling in space is a fundamental challenge because you can't rely on convection" [3]. The project also relies heavily on the continued development of the Starship rocket system [4].
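To illustrate the scale of the cooling problem Smith describes: with no air for convective cooling, an orbital data center must reject all waste heat by radiation alone, governed by the Stefan-Boltzmann law. The figures below are illustrative assumptions for a rough estimate, not SpaceX specifications.

```python
# Back-of-envelope: radiator area needed to reject heat purely by
# radiation (Stefan-Boltzmann law), since convection is unavailable in vacuum.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(power_w: float, temp_k: float, emissivity: float = 0.9) -> float:
    """Ideal one-sided radiator area to reject `power_w` watts at `temp_k`,
    ignoring absorbed sunlight and Earth-shine for simplicity."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

# Assumed example: a 1 MW compute module with radiators at 300 K
area = radiator_area_m2(1e6, 300.0)
print(f"{area:,.0f} m^2")  # on the order of a few thousand square meters
```

Even this idealized estimate yields radiator surfaces comparable to large solar arrays, which is why thermal management, alongside launch cadence, dominates the project's engineering risk.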
There is disagreement regarding the project's timeline. Some estimates suggest the first orbital AI data centers could be operational by 2028 [5], while other reports indicate SpaceX aims for capability as early as mid-2026 [1].
SpaceX's chief financial officer said, "The economics just don't add up yet; we could lose money on this venture" [6]. Other analysts, however, suggest the constellation could become a profitable AI-compute platform if the technical challenges are solved [3].
This initiative signals a shift in AI infrastructure from terrestrial 'mega-campuses' to space-based distributed computing. If SpaceX overcomes the cooling and launch risks, it could decouple AI growth from local energy grids and water scarcity, potentially creating a new monopoly on high-scale compute independent of national borders.