Devoured - April 30, 2026
Google to sell TPU chips to 'select' customers in latest shot at Nvidia (2 minute read)

Google is shifting from renting cloud TPU access to selling its custom AI chips directly to select customers for their own data centers, intensifying competition with Nvidia.

What: Alphabet announced it will sell Tensor Processing Units (TPUs) to select customers who can install them in their own facilities, rather than only offering TPU capacity through Google Cloud. The company recently launched TPU 8t for training and TPU 8i for inference, with deals already signed with Anthropic and Meta.
Why it matters: This signals a major strategic shift in the AI chip market, with cloud providers moving beyond their traditional rental models to compete directly with Nvidia in hardware sales. Amazon is pursuing a similar strategy with its own chips, collectively representing a multi-billion dollar challenge to Nvidia's market dominance.
Takeaway: If you're planning large-scale AI infrastructure, compare Google TPUs, Amazon's Trainium/Graviton, and Nvidia GPUs for your specific training and inference workloads, as vendor lock-in implications differ between cloud-only and owned hardware.
Decoder
  • TPU: Tensor Processing Unit, Google's custom-designed chips optimized specifically for machine learning workloads
  • Inferencing: Running trained AI models to make predictions, as opposed to training, which creates the models
  • Gigawatt agreement: Energy capacity commitment for data center chip deployments (1 gigawatt powers roughly 700,000 homes)
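The "1 gigawatt powers roughly 700,000 homes" rule of thumb is easy to sanity-check. A minimal sketch, assuming an average household draws about 1.4 kW continuously (a figure not from the article):

```python
# Rough check of the "1 gigawatt ~ 700,000 homes" conversion.
# ASSUMPTION (not from the article): an average home draws about
# 1.4 kW on a continuous basis (~12,000 kWh/year).

AVG_HOME_DRAW_KW = 1.4       # assumed continuous draw per home
GIGAWATT_IN_KW = 1_000_000   # 1 GW expressed in kilowatts

def homes_powered(gigawatts: float) -> int:
    """Estimate how many homes a given capacity could power continuously."""
    return int(gigawatts * GIGAWATT_IN_KW / AVG_HOME_DRAW_KW)

print(homes_powered(1))  # ~714,000 homes, close to the article's figure
print(homes_powered(5))  # scale of Amazon's 5 GW Anthropic agreement
```

Under that assumed per-home draw, 1 GW works out to roughly 714,000 homes, consistent with the article's approximation; the multi-gigawatt deals mentioned below are several times that.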
Original article

Google to sell TPU chips to 'select' customers in latest shot at Nvidia

Google parent Alphabet (GOOG, GOOGL) on Wednesday said that it plans to sell its custom Tensor Processing Units (TPUs) to select customers who will install the chips in their own data centers.

The move is a change from Google's prior strategy, which saw it rent out TPU capacity to customers from its own data centers — and is yet another strike at AI chip king Nvidia (NVDA).

The announcement, during the company's Q1 earnings call, comes a week after Alphabet announced two new TPUs: its TPU 8t for AI training and TPU 8i for inferencing.

"As TPU demand grows from AI labs, capital markets firms, and high-performance computing applications, we'll begin to deliver TPUs to a select group of customers in their own data centers in a hardware configuration to expand our addressable market opportunity," Alphabet CEO Sundar Pichai said during the company's first quarter earnings call.

Alphabet didn't disclose potential customers, but it signed a multiple-gigawatt agreement for next-generation TPUs with Anthropic (ANTH.PVT) earlier this month, with chips expected to begin coming online in 2027.

And according to The Information, Alphabet has also entered into a multibillion-dollar chip deal with Meta (META).

Alphabet's TPU maneuvers put it into ever greater competition with Nvidia, which has largely dismissed any fears that Alphabet's offerings will erode its lead in the space, saying that its chips offer greater flexibility for AI developers.

Google isn't the only company moving in on Nvidia's turf. Amazon (AMZN) is also offering up its own chips to customers.

In his annual shareholder letter, Amazon CEO Andy Jassy said that the company's chip business, which includes its Graviton, Trainium, and Nitro processors, has an annual revenue run rate of greater than $20 billion.

But because Amazon monetizes its chips only through its AWS EC2 (Elastic Compute Cloud) service, Jassy said the $20 billion figure likely understates the business's value, which he estimated would be closer to $50 billion.

Like Google, Amazon signed a new agreement with Anthropic, this one for 5 gigawatts of AI chip capacity, and also inked a deal with OpenAI for 2 gigawatts of chips.

On the CPU side, Amazon said it will deploy its AWS Graviton chips for Meta (META) to use across its agentic AI workloads.