Cerebras Systems, an AI chipmaker, filed an SEC prospectus on April 17[1] to pursue a $3 billion IPO[2], reviving a plan it scrapped in 2025[1].

The move matters because the company hopes to scale production of its wafer‑scale engines while leveraging new contracts with OpenAI and Amazon Web Services, two of the fastest‑growing buyers of AI compute. The funding could also support R&D for next‑generation chips.

Cerebras first announced an IPO in 2024 but withdrew the filing in 2025[1] amid market volatility and investor concerns about the timing of its custom‑chip roadmap. Reviving the offering after that withdrawal signals confidence in AI‑chip demand.

The revived filing comes after the firm closed two[3] major partnership deals, one with OpenAI and another with AWS, giving it access to those customers' most demanding inference workloads[3].

Cerebras is targeting a $3 billion valuation[2] as it goes public. Analysts see the $3 billion price tag as modest for a company that claims a single chip can deliver petaflops of performance, a claim that could position it as a key supplier for next‑generation large language models.

The prospectus, submitted to the U.S. Securities and Exchange Commission, outlines a planned offering of both primary shares and secondary stock sold by existing investors, and sets a timeline that could see shares debuting on a U.S. exchange before the end of the year[1].

Cerebras competes with established AI hardware firms such as Nvidia and AMD, whose GPUs dominate most data‑center workloads. Its wafer‑scale engine, however, promises higher throughput per chip, a claim that could attract customers seeking to reduce latency in large language model training.

Wall Street analysts have noted that the $3 billion valuation is lower than some peers’ market caps, suggesting the company may be seeking a quick public debut to fund its next generation of chips. The dual‑class offering structure, common among tech IPOs, allows early investors to retain voting control.

If the SEC clears the filing without major comments, Cerebras could price its shares in the summer and begin trading before the end of 2026, positioning itself ahead of the next wave of AI model deployments.

Cerebras’s flagship wafer‑scale engine houses more than 100,000 cores on a single silicon wafer, a density that dwarfs conventional GPUs. The architecture is designed to keep data on‑chip, reducing the need for costly data‑center interconnects.

Industry observers expect demand for custom AI silicon to grow as enterprises train larger models, making Cerebras’s timing potentially advantageous.

Investors will watch the company’s ability to meet production targets, as any delays could erode confidence and affect the IPO price.

Reviving the IPO demonstrates Cerebras's confidence that its wafer‑scale chips can capture a share of the rapidly expanding AI‑hardware market, especially as major customers like OpenAI and AWS seek custom silicon to lower inference costs. Successful fundraising could accelerate product roll‑outs, but the company must meet aggressive production targets to justify the modest $3 billion valuation and sustain investor interest.