We still don’t know how much energy AI consumes

The writer is a research scientist and climate lead at open-source AI platform Hugging Face

With every query, image generation and chatbot conversation, the energy consumed by artificial intelligence models rises. Emissions from the data centres needed to train and deliver AI services are already estimated at around 3 per cent of the global total, close to those of the aviation industry.

But not all AI models use the same amount of energy. Task-specific AI models like Intel’s TinyBERT and Hugging Face’s DistilBERT, which simply retrieve answers from text, consume minuscule amounts of energy — about 0.06 watt-hours per 1,000 queries. This is equivalent to running an LED bulb for 20 seconds. 
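That equivalence is easy to check with back-of-the-envelope arithmetic. A sketch, assuming a typical 10-watt household LED bulb (a figure not given above):

```python
# Back-of-the-envelope check: is 0.06 Wh roughly an LED bulb for 20 seconds?
# Assumption (not from the article): a typical household LED draws about 10 W.
led_power_w = 10.0
energy_wh = 0.06  # energy per 1,000 queries for a task-specific model

# Energy (Wh) = power (W) x time (h), so time (s) = energy / power * 3600
seconds = energy_wh / led_power_w * 3600
print(f"{seconds:.0f} seconds of LED runtime")  # ~22 seconds, close to the quoted 20
```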

At the other extreme, large language models such as OpenAI’s GPT-4, Anthropic’s Claude, Meta’s Llama, DeepSeek or Alibaba’s Qwen use thousands of times more energy for the same query. The result is like turning on stadium floodlights to look for your keys.

Why is there such an enormous difference in energy consumption? Because LLMs don’t just find answers: they generate them from scratch by recombining patterns from massive data sets. This requires more time, compute and energy than an internet search.

Measuring precisely how big each AI model is and how much energy it uses is difficult. Companies with closed-source systems, such as Google’s Gemini or Anthropic’s Claude, do not make their code publicly available and are protective of this information. That is why the internet is full of unverified claims about the quantities of energy and water that chatbot queries require, and how these compare with an internet search.

The AI Energy Score project, a collaboration between Salesforce, Hugging Face, AI developer Cohere and Carnegie Mellon University, is an attempt to shed more light on the issue by developing a standardised approach. The code is open and available for anyone to access and contribute to. The goal is to encourage the AI community to test as many models as possible.

By examining 10 popular tasks (such as text generation or audio transcription) on open-source AI models, it is possible to isolate the amount of energy consumed by the computer hardware that runs them. Models are assigned scores of one to five stars based on their relative efficiency. Between the most and least efficient AI models in our sample, we found a 62,000-fold difference in the energy required.
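One simple way to turn measured energy figures into a star rating is to rank models and split them into five equal-sized bins. This is an illustrative sketch only, with made-up numbers; the project's actual binning method may differ:

```python
# Illustrative sketch: assign 1-5 stars by ranking models on measured energy
# per task, with the most efficient fifth earning five stars.
# The AI Energy Score's real methodology may use different bins.

def assign_stars(energy_wh_by_model):
    # Sort models from least to most energy-hungry
    ranked = sorted(energy_wh_by_model, key=energy_wh_by_model.get)
    n = len(ranked)
    stars = {}
    for i, model in enumerate(ranked):
        bin_index = i * 5 // n          # 0..4, lowest-energy models first
        stars[model] = 5 - bin_index    # 5 stars = most efficient
    return stars

# Hypothetical energy figures (Wh per 1,000 queries), for illustration only
sample = {"tiny-model": 0.06, "mid-model": 5.0, "large-model": 120.0,
          "larger-model": 900.0, "frontier-model": 3700.0}
print(assign_stars(sample))
```

With five models, each lands in its own bin, so the most efficient gets five stars and the least efficient gets one.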

Since the project’s launch in February, a new tool has been added that compares the energy use of chatbot queries with everyday activities such as phone charging or driving, helping users understand the environmental impact of the tech they use daily.
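The kind of conversion such a tool performs can be sketched in a few lines. Both figures below are assumptions for illustration, not numbers from the article:

```python
# Sketch of expressing query energy as an everyday equivalent.
# Assumption (not from the article): a smartphone battery holds about 19 Wh.
PHONE_BATTERY_WH = 19.0

def phone_charges(query_energy_wh, queries):
    """Express total query energy as a number of full phone charges."""
    return query_energy_wh * queries / PHONE_BATTERY_WH

# Hypothetical example: 3 Wh per LLM query, 100 queries
print(f"{phone_charges(3.0, 100):.1f} phone charges")  # ~15.8 phone charges
```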

The tech sector is aware that AI emissions put its climate commitments in danger. Neither Microsoft nor Google appears to be on track to meet its net zero targets. So far, however, no Big Tech company has agreed to use the methodology to test its own AI models.

It is possible that AI models will one day help in the fight against climate change. AI systems pioneered by companies like DeepMind are already designing next-generation solar panels and battery materials, optimising power grid distribution and reducing the carbon intensity of cement production.

Tech companies are moving towards cleaner energy sources too. Microsoft is investing in the Three Mile Island nuclear power plant and Alphabet is engaging with more experimental approaches such as small modular nuclear reactors. In 2024, the technology sector accounted for 92 per cent of new clean energy purchases in the US.

But greater clarity is needed. OpenAI, Anthropic and other tech companies should start disclosing the energy consumption of their models. If they resist, then we need legislation that would make such disclosures mandatory.

As more users interact with AI systems, they should be given the tools to understand how much energy each request consumes. Knowing this might make them more careful about using AI for superfluous tasks like looking up a nation’s capital. Increased transparency would also be an incentive for companies developing AI-powered services to select smaller, more sustainable models that meet their specific needs, rather than defaulting to the largest, most energy-intensive options.

AI represents one of the biggest technological breakthroughs of our time. It will revolutionise our lives. But the technology comes with environmental costs that should be made clear to users and policymakers alike. In this era of climate crisis, making the energy use of AI more transparent is essential.
