AI Energy Consumption Overview
The energy consumption of artificial intelligence (AI) is a rapidly growing concern as AI is deployed across more sectors. That consumption is driven primarily by the computational requirements of AI models, particularly those used for generative tasks such as text, image, and video generation.
Current Energy Consumption Estimates
Estimates suggest that AI's energy consumption is substantial and growing. For instance, a query to a chatbot such as ChatGPT may consume roughly 10 times the electricity of a standard Google search.[1] The International Energy Agency (IEA) projects that data centers' electricity consumption in 2026 will be double that of 2022, approaching 1,000 terawatt-hours (TWh), roughly equivalent to Japan's total annual consumption.[2]
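The IEA doubling claim can be sanity-checked with simple arithmetic. The sketch below assumes a 2022 baseline of roughly 460 TWh for global data-center electricity use, a figure from the IEA's Electricity 2024 report that is not stated in the text above.

```python
# Sanity check of the IEA projection: doubling 2022 data-center demand.
# Assumption (not from the text above): ~460 TWh of global data-center
# electricity use in 2022, per the IEA's Electricity 2024 report.
BASELINE_2022_TWH = 460

projected_2026_twh = BASELINE_2022_TWH * 2
print(f"Projected 2026 demand: ~{projected_2026_twh} TWh")
# Japan's annual electricity consumption is on the order of 900-1,000 TWh,
# which is why the projection is described as "roughly equivalent to Japan".
```

The doubled figure lands in the 900–1,000 TWh range, consistent with the "approaching 1,000 TWh" framing.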
Breakdown of AI Energy Consumption
A significant portion of AI's energy consumption is attributed to the inference phase, in which trained models are used to generate outputs. Recent data from Meta and Google indicate that inference accounts for 60–70% of energy consumption, compared with 20–40% for training.[3] The energy required for a single AI query varies widely with model size, output type, and other factors. For example, generating a standard-quality image with Stable Diffusion 3 Medium requires about 1,141 joules of GPU energy, while generating a 5-second video with CogVideoX can require up to 3.4 million joules.[4]
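The per-query figures above can be put on a common scale by converting joules to kilowatt-hours (1 kWh = 3.6 million joules by definition). This sketch uses only the numbers cited in the text:

```python
# Convert the cited per-query GPU energy figures to kilowatt-hours.
JOULES_PER_KWH = 3.6e6  # 1 kWh = 3.6 MJ by definition

image_j = 1_141       # Stable Diffusion 3 Medium, standard-quality image
video_j = 3_400_000   # CogVideoX, 5-second video

image_kwh = image_j / JOULES_PER_KWH
video_kwh = video_j / JOULES_PER_KWH

print(f"Image: {image_kwh:.6f} kWh")                   # ~0.000317 kWh
print(f"Video: {video_kwh:.3f} kWh")                   # ~0.944 kWh
print(f"Video/image ratio: {video_j / image_j:.0f}x")  # ~2980x
```

The conversion shows why output type matters so much: a single short video draws roughly three thousand times the GPU energy of a single image.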
Future Projections
Projections indicate that AI's energy consumption will continue to grow. By 2028, AI-specific servers in US data centers are estimated to consume between 165 and 326 terawatt-hours of electricity per year, equivalent to the annual electricity use of about 22% of US households.[5] This growth is driven by the increasing adoption of AI models, particularly those used for generative tasks.
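The household comparison can be reproduced approximately. The sketch below assumes an average US household consumption of about 10,800 kWh/year and roughly 131 million US households (EIA figures, not stated in the text above):

```python
# Rough reproduction of the "22% of US households" comparison.
# Assumptions (not from the text): average US household electricity use
# of ~10,800 kWh/year and ~131 million US households (EIA figures).
KWH_PER_HOUSEHOLD = 10_800
US_HOUSEHOLDS = 131e6

high_estimate_kwh = 326e9  # upper bound of the projection: 326 TWh in kWh
households_powered = high_estimate_kwh / KWH_PER_HOUSEHOLD
share = households_powered / US_HOUSEHOLDS

print(f"~{households_powered / 1e6:.0f} million households ({share:.0%})")
```

Under these assumptions the upper-bound projection works out to roughly 30 million households, or about 23% of the US total, consistent with the cited figure.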
Environmental Impact
The environmental impact of AI's energy consumption is significant: training the BLOOM model is estimated to have emitted around 50 tonnes of CO2 equivalent, about 10 times the average annual emissions of a French person.[6] The carbon intensity of the electricity used by data centers is also a concern, with one study finding it to be 48% higher than the US average.[7]
Conclusion
The energy consumption of AI is substantial and growing, driven by the increasing adoption of generative models and the computational demands of inference. Estimates suggest that by 2028, AI's energy consumption could reach levels equivalent to powering a significant portion of US households.
Authoritative Sources
1. Generative AI: energy consumption soars. Polytechnique Insights.
2. Artificial intelligence climate energy emissions. Yale e360.
3. Generative AI: energy consumption soars. Polytechnique Insights.
4. We did the math on AI’s energy footprint. Here’s the story you haven’t heard. MIT Technology Review.
5. We did the math on AI’s energy footprint. Here’s the story you haven’t heard. MIT Technology Review.
6. Generative AI: energy consumption soars. Polytechnique Insights.
7. We did the math on AI’s energy footprint. Here’s the story you haven’t heard. MIT Technology Review.