Why does AI use so much electricity?
19/07/25 09:51
Makonin, Li, MacCormack, and I are working on developing a tiny generative imaging model, and we have surveyed the AI engineering literature pretty extensively. AI's electricity consumption is mainly due to:
· the use of accelerators, mainly graphics processing units (GPUs), which are also used for the Internet of Things, virtual reality, cloud gaming, and blockchain
· 5G. Telecoms around the world are building new data centers to support fifth-generation (5G) networks, which rely on core and edge network servers, essentially moving computational tasks closer to the end user
· Training ML models, especially large language models, requires repeated calculations over billions of parameters. Training a single AI model can emit as much carbon as five cars do over their lifetimes. This is pretty well known, but also
· Individual inferences, or uses, of an ML model require far less computation and electricity, but there can be millions or billions of inferences per day, e.g. Google Translate, ChatGPT, or Google's switch to AI-powered search. People use ML apps for tasks that a calculator or a search engine (or their own brain) could do, at a much higher electricity cost; see the rough sketch after this list. Sasha Luccioni and colleagues have a great article on this.
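To make the training-versus-inference contrast concrete, here is a minimal back-of-envelope sketch in Python. Every number in it (training energy, per-query energy, daily query volume) is a hypothetical placeholder chosen only to illustrate the arithmetic, not a measurement of any particular model or service.

```python
# Back-of-envelope sketch: why cheap individual inferences still add up.
# All figures below are hypothetical assumptions for illustration only,
# not measurements of any specific model or service.

TRAINING_ENERGY_KWH = 1_000_000    # assumed one-off energy cost of training a large model
ENERGY_PER_QUERY_KWH = 0.003       # assumed electricity per single inference (3 Wh)
QUERIES_PER_DAY = 100_000_000      # assumed daily query volume for a popular service

# Aggregate daily inference energy: tiny per-query cost times a huge volume.
daily_inference_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY

# How many days of serving queries it takes to match the one-off training cost.
days_to_match_training = TRAINING_ENERGY_KWH / daily_inference_kwh

print(f"Inference energy per day: {daily_inference_kwh:,.0f} kWh")
print(f"Days of inference to equal one training run: {days_to_match_training:.1f}")
```

With these placeholder numbers, cumulative inference energy overtakes the one-off training cost within a few days, which is the deployment-at-scale effect the last bullet describes.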
