How Datacenters are Managing the AI Energy Binge
How can we manage the energy needs of AI? That question increasingly concerns users of the technology, particularly generative AI (GenAI). According to at least one source, by 2027 AI servers could consume as much electricity annually as the entire country of Argentina. Even today, AI clusters are resource hogs: One study reported that training the GPT-3 model, for instance, consumed 1,287 megawatt-hours of electricity and produced 552 tons of carbon emissions.
That’s a big carbon footprint. And it’s not going to shrink anytime soon. The public cloud hyperscalers – AWS, Google, Microsoft, and Oracle – are running so-called AI factories comprising tens of thousands of GPUs, with each facility drawing thousands of kilowatts (kW) and emitting tons of carbon. One NVIDIA H100 GPU alone can draw up to 700 watts at peak, on the order of the average U.S. household’s continuous power draw. Multiply that by tens of thousands of GPUs, and the numbers quickly add up.
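To make the “numbers quickly add up” point concrete, here is a rough back-of-envelope sketch. Only the 700-watt H100 rating comes from the figures above; the GPU count, utilization, grid carbon intensity, and the omission of cooling overhead are illustrative assumptions, not reported data.

```python
# Back-of-envelope estimate of the power and emissions of one GPU cluster.
# Only the 700 W rating is taken from the article; everything else below
# is an illustrative assumption.

GPU_POWER_W = 700           # rated peak draw of one NVIDIA H100 (per the article)
NUM_GPUS = 20_000           # assumed size of a single "AI factory"
UTILIZATION = 0.7           # assumed average utilization over the year
HOURS_PER_YEAR = 8_760
GRID_KG_CO2_PER_KWH = 0.4   # assumed grid carbon intensity (kg CO2 per kWh)

# GPU power alone, ignoring cooling, networking, and other datacenter overhead (PUE).
cluster_power_mw = GPU_POWER_W * NUM_GPUS / 1e6

# Energy and emissions over a year at the assumed utilization and grid mix.
annual_energy_mwh = cluster_power_mw * UTILIZATION * HOURS_PER_YEAR
annual_co2_tons = annual_energy_mwh * 1_000 * GRID_KG_CO2_PER_KWH / 1_000

print(f"GPU power alone: {cluster_power_mw:.1f} MW")
print(f"Annual energy (GPUs only): {annual_energy_mwh:,.0f} MWh")
print(f"Annual CO2 at assumed grid mix: {annual_co2_tons:,.0f} metric tons")
```

Under these assumptions, the GPUs alone draw roughly 14 MW and consume on the order of 86,000 MWh per year; real facilities would consume more once cooling and other overhead are included.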
The situation has led chip design company Arm (Nasdaq: ARM) and its parent SoftBank to contribute $25 million to a $110 million joint U.S.-Japan research project aimed at alleviating the AI power drain. “[B]y the end of the decade, AI data centers could consume as much as 20% to 25% of U.S. power requirements. Today that’s probably 4% or less,” Arm CEO Rene Haas told the Wall Street Journal.
Evidence of AI’s power draw can be seen in the surging share prices of companies supplying power equipment to the U.S., where most of the world’s AI training currently takes place. Among them is a group of Korean firms serving the U.S. market, including HD Hyundai Electric Co., Hyosung Heavy Industries Corp., and LS Electric Co. These firms are making hay by supplying AI datacenters with power equipment that replaces aging U.S. infrastructure.
Addressing the Power Problem
To access the rest of this article, you need a Futuriom CLOUD TRACKER PRO subscription.