How much electricity does AI training use?

Training a single large AI model consumes as much electricity as an average American home uses in more than 100 years.

Training advanced AI models like GPT-3 requires thousands of powerful processors running continuously for weeks or months. This intensive process consumes approximately 1,287 megawatt-hours of electricity. Since a typical U.S. household uses about 10.6 megawatt-hours per year, a single training run equals more than a century of residential electricity consumption.
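The comparison itself is just one division. Here is a minimal Python sketch using the two figures quoted in this fact (the GPT-3 training estimate and the EIA household average); it only checks the arithmetic, nothing else.

```python
# Back-of-the-envelope check of the headline comparison, using the
# figures quoted in this fact.

GPT3_TRAINING_MWH = 1_287        # estimated electricity for one GPT-3 training run
HOUSEHOLD_MWH_PER_YEAR = 10.632  # average U.S. residential customer, per EIA

# Years of one home's electricity (equivalently, homes powered for one year).
years_equivalent = GPT3_TRAINING_MWH / HOUSEHOLD_MWH_PER_YEAR

print(f"One training run ≈ {years_equivalent:.0f} years of a single home's electricity")
# Prints ≈ 121, i.e. "more than a century" of household use.
```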
Nerd Mode
The energy demands of artificial intelligence stem from the massive computational scale of large language models. A 2021 study by researchers at Google and the University of California, Berkeley, found that training GPT-3 required 1.287 gigawatt-hours of electricity, equivalent to the annual energy consumption of approximately 120 average American homes.

The hardware used for these tasks consists of thousands of specialized chips, such as NVIDIA A100 GPUs, running 24 hours a day for weeks or months to process trillions of words. Data centers also require constant cooling to prevent the hardware from overheating, which significantly increases total power consumption.

According to the U.S. Energy Information Administration, the average annual electricity consumption for a U.S. residential customer is about 10,632 kilowatt-hours. Multiplied over roughly 120 years, this adds up to the energy footprint of a single GPT-3 training run. Larger models, such as GPT-4, are estimated to require even more energy.

This high energy consumption has raised concerns about the carbon footprint of the tech industry. Researchers are now developing 'Green AI' approaches to create more efficient algorithms that reduce computational costs without sacrificing model performance.
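A bottom-up way to arrive at a number of this size is to multiply chip count, per-chip power draw, training time, and a data-center overhead factor (PUE) that accounts for cooling. The sketch below is only an illustration: the cluster size, per-GPU power, duration, and PUE are assumed placeholder values chosen to land near the published figure, not reported specifications of the GPT-3 run.

```python
# Illustrative bottom-up estimate of a large training run's energy use.
# The structure (chips x power x time x cooling overhead) follows the
# description above; the specific numbers are placeholder assumptions.

NUM_GPUS = 10_000            # assumed size of the training cluster
AVG_POWER_PER_GPU_KW = 0.33  # assumed average draw per accelerator, in kW
TRAINING_DAYS = 15           # assumed wall-clock training time
PUE = 1.1                    # power usage effectiveness: cooling/overhead multiplier

it_energy_mwh = NUM_GPUS * AVG_POWER_PER_GPU_KW * TRAINING_DAYS * 24 / 1_000
total_energy_mwh = it_energy_mwh * PUE

print(f"IT load alone:       {it_energy_mwh:,.0f} MWh")
print(f"Including overheads: {total_energy_mwh:,.0f} MWh")
# With these assumptions the total lands near the ~1,300 MWh scale
# reported for GPT-3, which is why per-chip power and cooling both matter.
```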
Verified Fact FP-0003010 · Feb 17, 2026

- Technology -

energy consumption · Green AI · sustainability