Anthropic PBC is in early talks with investors to raise at least $30 billion [1] in new financing.

This capital injection would place the San Francisco-based company at a valuation of $900 billion [1], signaling a massive increase in investor appetite for large-scale artificial intelligence development. The funding is intended to support the growth of the company's AI models and position the firm for a potential initial public offering.

While some reports suggest the company is seeking $30 billion [1], other estimates indicate the total could reach as high as $50 billion [7]. Valuation estimates also vary slightly, with some sources placing the company's value at $950 billion [8].

Major technology firms have already earmarked significant sums for the company. Google has committed $10 billion [3], while Amazon has set aside $5 billion [4]. These commitments may increase if specific targets are met, with potential additional funding of $30 billion from Google [5] and $20 billion from Amazon [6].

The scale of this fundraising reflects the high cost of computing power and talent required to maintain a competitive edge in the AI race. By securing these funds, Anthropic aims to accelerate the development of its next-generation models, a move that would solidify its position as a primary competitor to other leading AI labs.

As the company eyes a potential IPO, the current funding round serves as a critical benchmark for its market value. The involvement of cloud giants like Amazon and Google suggests a strategic interdependence between AI model developers and the infrastructure providers that host them.

The potential $900 billion valuation suggests that investors are pricing AI companies not just as software firms, but as foundational infrastructure for the future global economy. The heavy involvement of Amazon and Google points to a "cloud-for-equity" cycle, in which AI labs trade ownership for the massive compute resources needed to train larger models, creating a tight loop between the builders of AI and the providers of the hardware that hosts it.