How the massive power draw of generative AI is overtaxing our grid


Thanks to the artificial intelligence boom, new data centers are springing up as quickly as companies can build them. This has translated into huge demand for power to run and cool the servers inside. Now concerns are mounting about whether the U.S. can generate enough electricity for the widespread adoption of AI, and whether our aging grid will be able to handle the load.

“If we don’t start thinking about this power problem differently now, we’re never going to see this dream we have,” said Dipti Vachani, head of automotive at Arm. The chip company’s low-power processors have become increasingly popular with hyperscalers like Google, Microsoft, Oracle and Amazon — precisely because they can reduce power use by up to 15% in data centers.

Nvidia’s latest AI chip, Grace Blackwell, incorporates Arm-based CPUs that the company says can run generative AI models on 25 times less power than the previous generation.

“Saving every last bit of power is going to be a fundamentally different design than when you’re trying to maximize the performance,” Vachani said.

This strategy of reducing power use by improving compute efficiency, often referred to as “more work per watt,” is one answer to the AI energy crisis. But it’s not nearly enough.

One ChatGPT query uses nearly 10 times as much energy as a typical Google search, according to a report by Goldman Sachs. Generating an AI image can use as much power as charging your smartphone.

This problem isn’t new. A 2019 study estimated that training one large language model produced as much CO2 as five gas-powered cars over their entire lifetimes.

The hyperscalers building data centers to accommodate this massive power draw are also seeing emissions soar. Google’s latest environmental report showed greenhouse gas emissions rose nearly 50% from 2019 to 2023, in part because of data center energy consumption, although the company also said its data centers are 1.8 times as energy efficient as a typical data center. Microsoft’s emissions rose nearly 30% from 2020 to 2024, also due in part to data centers.

And in Kansas City, where Meta is building an AI-focused data center, power needs are so high that plans to close a coal-fired power plant are being put on hold.

Hundreds of ethernet cables connect server racks at a Vantage data center in Santa Clara, California, on July 8, 2024. (Photo: Katie Tarasov)

Chasing power

There are more than 8,000 data centers globally, with the highest concentration in the U.S. And, thanks to AI, there will be far more by the end of the decade. Boston Consulting Group estimates demand for data centers will rise 15%-20% every year through 2030, when they’re expected to account for 16% of total U.S. power consumption. That’s up from just 2.5% before OpenAI’s ChatGPT was released in 2022, and it’s roughly the power used by two-thirds of all U.S. homes.

CNBC visited a data center in Silicon Valley to find out how the industry can handle this rapid growth, and where it will find enough power to make it…



