Anthropic has signed a new agreement for additional Google Cloud TPU capacity to power Claude, with the new compute expected to come online in 2027.
In short: Anthropic says it has signed a new agreement with Google and Broadcom for additional computing capacity to train and run its Claude AI models.
Anthropic, the company behind the Claude chatbot, has announced a new agreement with Google and Broadcom for more processing power, which it will use to train and run its AI models.
The deal expands Anthropic’s use of Google Cloud TPUs, specialized chips designed specifically for AI workloads: like a factory line built for one kind of product instead of a general-purpose workshop.
Anthropic said the new capacity will come online in 2027. The company did not share exact figures, but TechCrunch cited a Broadcom filing with U.S. regulators suggesting the agreement covers 3.5 gigawatts of compute capacity. A gigawatt measures electrical power, so the figure conveys the scale of the data-center hardware involved, much like describing the output of a power plant.
Anthropic also said most of this capacity will be located in the United States. The company framed it as part of its earlier commitment to invest $50 billion in U.S. compute infrastructure, meaning the data centers and equipment needed to run AI.
TechCrunch also reported that demand for Anthropic’s services has grown quickly. The company says its run-rate revenue is now $30 billion, up from $9 billion at the end of 2025, and that more than 1,000 of its business customers each spend over $1 million a year.
AI tools like Claude need large amounts of computing power to keep working quickly as more people and companies use them. Deals like this can affect how available, fast, and expensive these tools are for customers over the next few years.
Source: TechCrunch AI