Which uses more electricity, AI or crypto?

Q. Which uses more electricity, AI or crypto? What about online video?
 
A. They both use a lot! AI uses more!

Researchers at the International Energy Agency computed the electricity required to train the largest AI model listed by Epoch AI (one with a training compute of 10²³ FLOP) to be 310 gigawatt-hours. On that basis they estimated the total training electricity consumption of all AI models from 2020 to 2024 to be 1,700 GWh (0.1% of the global electricity consumption of data centers over this period). That works out to about 425 GWh of electricity annually.

(They admit the estimate is rough, but within an order of magnitude.) (International Energy Agency, “Energy and AI,” 2025. https://www.iea.org/reports/energy-and-ai/)

That's just the training. I would at least double this figure to account for inference (individual use), and add another 10-20% for chip manufacture. Then I'd calculate the contribution of high-bandwidth networks and user-end devices.
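This back-of-envelope adjustment is simple arithmetic; here is a minimal sketch, using the 1,700 figure quoted from the IEA report and treating the inference multiplier and chip-manufacture overhead as the rough guesses they are:

```python
# Annualize the IEA training estimate, then apply the rough adjustments above.
# Results are in the same energy units as the quoted 2020-2024 total.
TRAINING_TOTAL = 1700        # all AI training, 2020-2024 (IEA estimate)
YEARS = 4

training_per_year = TRAINING_TOTAL / YEARS    # 425 per year
with_inference = training_per_year * 2        # "at least double" for inference
low = with_inference * 1.10                   # +10% for chip manufacture
high = with_inference * 1.20                  # +20% for chip manufacture

print(f"training: {training_per_year:.0f}/yr; "
      f"rough total: {low:.0f}-{high:.0f}/yr")
```

This still leaves out the high-bandwidth networks and user-end devices mentioned above, which would be added on top.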

Cryptocurrency is “mined” using different equipment from the data centers that power most other uses, which makes it relatively simple to measure its electricity consumption. The Cambridge Bitcoin Electricity Consumption Index currently estimates Bitcoin mining to consume about 187 TWh of electricity annually.

(That figure is annualized consumption based on crypto mining on July 30, 2025. It falls between a lower bound of 95.25 TWh and an upper bound of 418.55 TWh.) https://ccaf.io/cbnsi/cbeci
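For a sense of scale, an annualized consumption figure like this converts to an average continuous power draw by dividing by the hours in a year; a quick sketch with the CBECI numbers quoted above:

```python
# Convert annualized electricity consumption (TWh/yr) to average power (GW).
HOURS_PER_YEAR = 8760

def avg_power_gw(twh_per_year: float) -> float:
    # 1 TWh = 1000 GWh, and GWh divided by hours gives GW
    return twh_per_year * 1000 / HOURS_PER_YEAR

print(f"central: {avg_power_gw(187):.1f} GW")
print(f"bounds: {avg_power_gw(95.25):.1f}-{avg_power_gw(418.55):.1f} GW")
```

At the central estimate, Bitcoin mining draws a continuous average of roughly 21 gigawatts.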

I believe online video uses still more! As with AI, it's difficult to calculate separately from the total electricity for manufacturing and use of data centres, networks, and devices.

Tiny language models and tiny image generation


Why does AI use so much electricity?

Makonin, Li, MacCormack, and I are working on developing a tiny generative imaging model. We have surveyed the AI engineering literature pretty extensively. AI's electricity consumption is mainly due to:
·  the use of accelerators, mainly graphics processing units (GPUs), which are also used for the Internet of Things, virtual reality, cloud gaming, and blockchain
·  5G: telecoms around the world are building new data centers to support fifth-generation (5G) networking, which relies on core and edge network servers, essentially moving computational tasks closer to the user end
·  training: machine learning models, especially large language models, require repetitions of calculations over billions of parameters during training. Training a single AI model can emit as much carbon as five cars over their lifetimes. This is pretty well known, but also
·  inference: each individual inference, or use, of an ML model takes much less computation and electricity, but there can be millions or billions of inferences per day, e.g. Google Translate, ChatGPT, or Google's switch to AI searches. People use ML apps for tasks that a calculator or search engine (or their own brain) could do at much lower electricity cost. There is a great article on this by Sasha Luccioni and colleagues.
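The inference point is easy to illustrate with back-of-envelope arithmetic. The per-query energy and query volume below are illustrative assumptions, not figures from this post (published estimates for chatbot queries vary widely):

```python
# Back-of-envelope: a small per-inference energy times a huge daily volume.
# Both inputs are illustrative assumptions, not measured values.
WH_PER_QUERY = 0.3        # assumed energy per chatbot query, in watt-hours
QUERIES_PER_DAY = 1e9     # assumed global query volume
DAYS_PER_YEAR = 365

annual_wh = WH_PER_QUERY * QUERIES_PER_DAY * DAYS_PER_YEAR
annual_twh = annual_wh / 1e12   # 1 TWh = 10^12 Wh

print(f"{annual_twh:.2f} TWh per year")
```

Even at a fraction of a watt-hour per query, a billion queries a day adds up to on the order of a tenth of a terawatt-hour per year under these assumptions, and the per-query cost of an LLM is far higher than that of a plain search or a calculator.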

Environmental impact of machine learning

Mitigating the Environmental Impact of Machine Learning is my new research team with SFU computer scientist Stephen Makonin (collaborator on Tackling the Carbon Footprint of Streaming Media), Makonin’s graduate student Kehui Li, and AI artist and SCA PhD student Jess MacCormack. We are researching the environmental impact of machine learning, aka artificial intelligence, and ways to mitigate it, such as by developing models that use much less electricity. The team is funded by Small-File Generative Art, an SSHRC Insight Grant held by Arne Eigenfeldt (PI), Jim Bizzocchi, and me.

low-impact music streaming

An article on low-impact music streaming, quoting me, in the MIT Technology Review:
https://www.technologyreview.com/2024/07/17/1095024/music-streaming-climate-friendly-tips/