Google’s decade-long bet on TPUs is the company’s secret weapon in the AI race



Nvidia has established itself as the undisputed leader in artificial intelligence chips, selling large quantities of silicon to most of the world’s biggest tech companies en route to a $4.5 trillion market cap.

One of Nvidia’s key clients is Google, which has been loading up on the chipmaker’s graphics processing units, or GPUs, to try to keep pace with soaring demand for AI compute power in the cloud.

While there’s no sign that Google will be slowing its purchases of Nvidia GPUs, the internet giant is increasingly showing that it’s not just a buyer of high-powered silicon. It’s also a developer.

On Thursday, Google announced that its most powerful chip yet, called Ironwood, is being made widely available in the coming weeks. It’s the seventh generation of Google’s Tensor Processing Unit, or TPU, the company’s custom silicon that’s been in the works for more than a decade.

TPUs are application-specific integrated circuits, or ASICs, which play a crucial role in AI by providing highly specialized and efficient hardware for particular tasks. Google says Ironwood is designed to handle the heaviest AI workloads, from training large models to powering real-time chatbots and AI agents, and is more than four times faster than its predecessor. AI startup Anthropic plans to use up to 1 million of them to run its Claude model.

For Google, TPUs offer a competitive edge at a time when all the hyperscalers are rushing to build mammoth data centers, and AI processors can’t get manufactured fast enough to meet demand. Other cloud companies are taking a similar approach, but are well behind in their efforts.

Amazon Web Services made its first cloud AI chip, Inferentia, available to customers in 2019, followed by Trainium three years later. Microsoft didn’t announce its first custom AI chip, Maia, until the end of 2023.

“Of the ASIC players, Google’s the only one that’s really deployed this stuff in huge volumes,” said Stacy Rasgon, an analyst covering semiconductors at Bernstein. “For other big players, it takes a long time and a lot of effort and a lot of money. They’re the furthest along among the other hyperscalers.”


Originally built for internal workloads, Google’s TPUs have been available to cloud customers since 2018. Of late, Nvidia has shown some level of concern. When OpenAI signed its first cloud contract with Google earlier this year, the announcement spurred Nvidia CEO Jensen Huang to initiate further talks with the AI startup and its CEO, Sam Altman, according to reporting by The Wall Street Journal.

Unlike Nvidia, Google isn’t selling its chips as hardware, but rather providing access to TPUs as a service through its cloud, which has emerged as one of the company’s big growth drivers. In its third-quarter earnings report last week, Google parent Alphabet said cloud revenue increased 34% from a year earlier to $15.15 billion, beating analyst estimates. The company ended the quarter with a business…

