Anthropic Signs $21 Billion Chip Deal with Broadcom: Securing Nearly 1 Million Google TPU v7p Units

Anthropic commits $21 billion to chip orders via Broadcom, securing nearly 1 million Google TPU v7p units for delivery by end of 2026. This massive deal highlights AI startups' strategic positioning in the computing power race.

Silicon Valley — AI startup Anthropic has struck a blockbuster deal with chip design giant Broadcom, committing $21 billion to custom chip orders and securing nearly 1 million Google TPU v7p units for delivery by the end of 2026. This massive agreement highlights strategic positioning in the computing power race among AI companies.

$21 Billion: A Premium Computing Investment

Under the terms of the agreement, Anthropic has committed $21 billion to Broadcom for chip procurement: $10 billion is confirmed, with the remaining $11 billion expected to be fulfilled by the end of 2026.

These chips will be delivered as fully assembled "rack-level AI systems," ready for direct deployment in Anthropic's data centers. Notably, the chips are based on Google-designed Tensor Processing Units (TPU) version v7p and manufactured by Broadcom.

Broadcom: A Key Player in AI Supply Chain

Anthropic's order further underscores Broadcom's growing prominence in the AI supply chain. While NVIDIA dominates the AI chip sector, Broadcom has found its niche through the custom chip route.

Broadcom recently announced that it expects AI chip sales to exceed $100 billion in 2027, and Anthropic's order lends strong support to that projection.

Meanwhile, Google reportedly cut its 2026 TPU production target from 4 million to 3 million units due to limited advanced packaging capacity. TPU resources thus remain scarce in the market, making Anthropic's early move to lock in a massive chip supply particularly prudent.

AI Computing Competition Intensifies

From Meta's custom chips to Anthropic's TPU deal to OpenAI's continuously iterating models, tech giants and AI startups alike are scrambling for computing resources. This reflects the explosion in computing demand driven by the rapid development of AI technology.

For Anthropic, securing abundant computing power is necessary not only to support current model operations but also to maintain technological leadership in an intensely competitive AI field. The Claude series has been a major rival to OpenAI's GPT series, and computing resources directly determine model training and inference capabilities.

*Sources: RCRTech, TMTPOST, AIBase