Anthropic’s growth and AWS advantage
Anthropic, which develops the Claude family of generative AI models, announced Tuesday that it closed a $13 billion funding round at a post-money valuation of $183 billion—nearly triple its worth in March.
Amazon’s total investment in the startup now stands at $8 billion, following its additional $4 billion commitment last November.
The startup primarily trains its Claude models on AWS infrastructure, using Amazon’s in-house Trainium chips for training and Inferentia chips for inference workloads.
This integration provides Amazon with both a technology showcase and a growing revenue stream.
Barclays analyst Ross Sandler noted in a Thursday report that AWS could see a meaningful uplift from Anthropic by the fourth quarter, particularly if the startup begins pre-training its Claude 5 model during that time.
While Anthropic currently contributes about 100 basis points (one percentage point) to AWS growth, Sandler projected that this could expand to as much as 400 basis points per quarter once both training and inference workloads scale.
Industry competition and strategic positioning
Anthropic is emerging as one of the strongest challengers to OpenAI, with Sandler highlighting the startup’s momentum in its application programming interface (API) business.
According to press reports cited by Barclays, Anthropic’s API business is now about double the size of OpenAI’s, bolstered by its paid services such as Claude Code, an AI-powered coding assistant.
Amazon’s close partnership ensures AWS remains central to Anthropic’s growth, though industry commentary has raised questions about performance issues tied to certain training projects, including T3 and Project Rainier.
Sandler suggested that as long as AWS retains Anthropic’s core training workloads, the cloud division should see accelerating growth.