Akamai Deploys Thousands of Blackwell GPUs: Edge Computing Giant Enters the AI Race
Akamai announces the acquisition of thousands of NVIDIA Blackwell GPUs to build one of the world's most widely distributed AI platforms. The CDN giant is transforming from "content delivery" to "compute delivery."
Cambridge, MA — March 3 — Akamai announced the acquisition of thousands of NVIDIA Blackwell GPUs to build one of the world's most widely distributed AI platforms. The edge computing giant, best known for its CDN business, is undergoing a major strategic transformation.
From "Content Delivery" to "Compute Delivery"
Akamai might be unfamiliar to average users, but if you've watched videos, played games, or visited major websites, you've likely used its services — Akamai is the world's largest CDN provider, handling over a third of global internet traffic daily.
But over the past few years, the veteran tech company has been searching for new growth drivers.
"We realized that the future of edge computing isn't just about delivering content — it's about delivering compute," Akamai's CEO said at the press conference. "When AI becomes infrastructure, our positional advantage becomes clear."
Why Blackwell?
The short answer: there was little real choice.
In today's extremely scarce AI compute market, securing NVIDIA's latest GPUs is itself a capability. Akamai's acquisition of Blackwell GPUs is estimated to be worth hundreds of millions of dollars — for a company with annual revenue under $4 billion, this is a bold bet.
"Blackwell is currently the most powerful AI chip," a semiconductor analyst said. "Akamai needs cutting-edge technology to prove its value in the AI era."
Edge AI: The Overlooked Blue Ocean
Akamai's strategy isn't out of nowhere. Edge AI is becoming the next big opportunity.
Edge AI means running AI inference on edge nodes closer to users. Compared to traditional cloud AI, edge AI has three major advantages: lower latency, less bandwidth consumption, and better data privacy.
"Imagine telling your phone to order a cup of coffee," the analyst explained. "If the AI has to send your voice to a cloud server thousands of miles away and back, latency could be 500ms. If it runs on the nearest edge node, latency might be just 50ms."
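A large part of the latency gap the analyst describes is simple physics: signals in fiber travel at roughly 200 km per millisecond, so distance alone sets a floor on round-trip time. A minimal sketch of that arithmetic (the distances and the fixed processing budget are illustrative assumptions, and `round_trip_ms` is a hypothetical helper, not anything from Akamai):

```python
# Illustrative sketch: the propagation-delay floor on round-trip latency.
# All numbers are assumptions for illustration, not measurements.

SPEED_IN_FIBER_KM_PER_MS = 200  # light in fiber covers roughly 200 km per ms

def round_trip_ms(distance_km: float, processing_ms: float = 10.0) -> float:
    """Propagation delay there and back, plus a fixed processing budget."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS + processing_ms

cloud_km = 4000  # a distant cloud region, thousands of kilometers away
edge_km = 100    # a nearby edge node

print(round_trip_ms(cloud_km))  # 50.0 ms from distance alone, before queuing
print(round_trip_ms(edge_km))   # 11.0 ms
```

Real-world figures like the 500ms quoted above also include model inference, queuing, and network hops, but the sketch shows why no amount of software optimization can close a gap that is baked into the distance itself.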
For real-time AI applications — autonomous driving, smart manufacturing, AR/VR — edge AI is almost essential.
Akamai's Unique Advantage
Akamai has deployed over 400,000 servers globally, covering more than 140 countries. Most of these servers sit at edge locations extremely close to users: carrier data centers and even base stations.
"That's our moat," the CEO said. "Building an edge network of this scale could take ten years and hundreds of billions of dollars."
By layering AI compute onto its existing edge network, Akamai can offer users near-local AI performance anywhere in the world.
Challenges and Opportunities
Of course, transformation isn't without challenges.
First, AI operations are completely different from traditional CDN operations. Akamai needs substantial AI talent to manage these GPU clusters.
Second, major cloud providers — AWS, Azure, Google Cloud — are also deploying edge AI. Akamai needs to prove it can compete.
Third, and most critically — will customers accept edge AI? Most enterprises still prefer cloud AI services.
"But the market is changing," the CEO said. "When latency becomes a bottleneck, customers will come to us."
Epilogue
Akamai's acquisition may mark a turning point for the edge computing industry.
When NVIDIA GPUs start appearing on hundreds of thousands of edge nodes worldwide, a new AI infrastructure landscape takes shape, one in which "old players" like Akamai may hold advantages over the cloud vendors.
After all, in the AI era, location is everything.
Sources: GlobeNewswire, The Verge, TechCrunch