Which uses more electricity, AI or crypto?

Q. Which uses more electricity, AI or crypto? What about online video?
 
A. They both use a lot! AI uses more!

Researchers at the International Energy Agency computed the electricity required to train the largest AI model in Epoch AI's database (one with a training compute of 10^23 FLOP) to be 310 gigawatt-hours. On that basis they estimated the total electricity consumed in training all AI models from 2020 to 2024 at 1,700 GWh (0.1% of the global electricity consumption of data centers over this period). That works out to 425 GWh of electricity annually.

(They concede the estimate is rough, but believe it accurate to within an order of magnitude. International Energy Agency, “Energy and AI,” 2025: https://www.iea.org/reports/energy-and-ai/)
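The annualization above is simple arithmetic, and worth checking; a minimal sketch using the figures as quoted (the quoted annual number implies dividing the period total by four):

```python
# Annualizing the IEA's 2020-2024 AI-training electricity total quoted above.
total_2020_2024 = 1700   # total training consumption over the period (IEA estimate)
years = 4                # the quoted annual figure implies four year-long intervals
per_year = total_2020_2024 / years
print(per_year)          # 425.0 per year, matching the figure in the text
```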

That's just the training. I would at least double this figure to account for inference (individual use), and add another 10-20% for chip manufacture. Then I'd calculate the contribution of high-bandwidth networks and user-end devices.
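Those adjustments are easy to put into a back-of-envelope form. A minimal sketch, assuming only the multipliers named above (the network and user-device terms are left out, since they have not yet been estimated):

```python
# Back-of-envelope adjustment of an annual AI training-electricity figure:
# double it for inference, then add 10-20% for chip manufacture.
# Networks and user-end devices are deliberately excluded here.
def adjusted_range(training):
    with_inference = training * 2    # "at least double" to account for inference
    low = with_inference * 1.10      # +10% for chip manufacture
    high = with_inference * 1.20     # +20% for chip manufacture
    return low, high

# Example with the annual training figure derived above (same units as input).
low, high = adjusted_range(425)
print(low, high)
```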

Cryptocurrency is “mined” using different equipment from the data centers that power most other computing, which makes its electricity consumption relatively simple to measure. The Cambridge Bitcoin Electricity Consumption Index currently estimates that Bitcoin mining consumes about 187 TWh of electricity annually.

(That figure is annualized consumption based on crypto mining as of July 30, 2025, with a lower bound of 95.25 TWh and an upper bound of 418.55 TWh. https://ccaf.io/cbnsi/cbeci)

I believe online video uses still more! As with AI, its consumption is difficult to separate from the total electricity used to manufacture and run data centers, networks, and devices.