
The accelerating demand for power driven by AI has become a dominant narrative in energy markets. However, there are good reasons to treat the most dramatic projections with scepticism.
Current International Energy Agency (IEA) forecasts suggest that by 2030 the power needed to run both new and existing AI data centres could surpass 945 terawatt-hours (TWh), more than the annual electricity consumption of Japan.1 That sounds staggering, but before we accept such projections it is worth remembering how often similar predictions have missed the mark.
In 1999, the US coal industry claimed that IT would need half the nation's electricity by 2020, so the US economy needed more coal. Similarly in 1999, Intel forecast that putting "one billion PCs on the web represent an electrical demand equal to the total capacity of the U.S. today".2 It didn't happen. Instead, the rise of the Internet reshaped consumption patterns, with online commerce reducing the need for malls and decentralised warehouses. We also innovated. Between 2010 and 2018 global data centre compute increased by more than 550x, yet data centre energy use rose by only around 6%.3 Today, the Internet accounts for less than 2% of total Western power use.4
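To see just how large that efficiency gain was, a rough back-of-envelope calculation (a sketch using only the 550x compute growth and ~6% energy growth cited above) shows the implied fall in energy used per unit of compute:

```python
# Back-of-envelope: implied efficiency gain from the 2010-2018 figures cited above.
compute_growth = 550      # global data centre compute grew >550x (2010-2018)
energy_growth = 1.06      # data centre energy use rose ~6% over the same period

energy_per_compute_2018 = energy_growth / compute_growth   # relative to 2010 = 1.0
print(f"Energy per unit of compute in 2018 vs 2010: {energy_per_compute_2018:.4f}")
print(f"Implied efficiency improvement: ~{compute_growth / energy_growth:.0f}x")
# Roughly a 500x improvement: energy per unit of compute fell by about 99.8%.
```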
So, will this time be different?
Reasons for Scepticism
Several factors suggest that AI’s long-term energy footprint may not grow as explosively as some expect.
1. Smarter, leaner AI models
Recent breakthroughs emerging from China have demonstrated that some AI systems can achieve about 90% of the performance of large proprietary models at roughly one-tenth the cost, and therefore with far less energy. This is partly driven by their use of open-source models and on-device AI. On-device AI refers to running artificial intelligence processes directly on an end-user's device, such as a smartphone, rather than in the cloud. In the West, leading AI companies are building their models on separate cloud platforms. These walled gardens inevitably create duplication while requiring more energy to transmit data between edge devices and data centres. In contrast, on-device AI processes data locally, eliminating the need for that transmission and the excess energy consumption that comes with it.
It is too soon to know which approach will 'win', but the economics of these leaner models may become hard to argue against for the majority of users, especially those outside the US.
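The energy argument for on-device processing can be made concrete with a toy accounting sketch. The per-query figures below are purely hypothetical placeholders, not drawn from the article; they serve only to show how removing the transmission and facility-overhead terms changes the total:

```python
# Toy energy accounting for a single AI query (all figures hypothetical, illustration only).
def cloud_query_energy(compute_wh, transmit_wh, overhead_ratio):
    """Cloud inference: data centre compute + facility overhead (cooling etc.) + network transmission."""
    return compute_wh * (1 + overhead_ratio) + transmit_wh

def on_device_query_energy(compute_wh):
    """On-device inference: local compute only, no transmission and no facility overhead.
    Simplification: assumes the device needs the same compute energy as the data centre."""
    return compute_wh

cloud = cloud_query_energy(compute_wh=0.3, transmit_wh=0.05, overhead_ratio=0.6)
local = on_device_query_energy(compute_wh=0.3)
print(f"Cloud query:     {cloud:.2f} Wh")
print(f"On-device query: {local:.2f} Wh")
```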
2. Declining marginal energy needs
Training a new AI model is energy-intensive, often requiring months of computation; it is the six months of preparation before a marathon. Yet once a model is trained, the additional power needed to run it and deliver inference can often be far smaller (the hours on the big day). It is not unreasonable to think that as these systems mature and training cycles stabilise, total power demand may grow much more slowly than early projections assume. Indeed, a world-leading AI computing company recently announced that between 2016 and 2025 its chips will have become 10,000x more energy efficient. A continuation of that pace is unlikely, but to put it in perspective, if cars had improved at the same rate they would now achieve 280,000 miles per gallon.
Elsewhere, innovation is already driving measurable improvements, especially in cooling, which accounts for a significant share of data centre power use. Historically, for every unit of energy used by servers, data centres needed around 1.5 units of energy for cooling. Between 2007 and 2024 that figure fell to 0.6, and some data centres are now achieving 0.1 or below.5
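Translating those cooling ratios into total facility energy per unit of useful server work (a simplified sketch that treats cooling as the only overhead and ignores items such as power-distribution losses) shows how much of the projected demand efficiency can remove:

```python
# Total facility energy per unit of server (IT) energy, using the cooling ratios cited above.
# Simplification: cooling is treated as the only overhead.
for label, cooling_ratio in [("historical", 1.5), ("2024 typical", 0.6), ("best-in-class", 0.1)]:
    total = 1 + cooling_ratio          # 1 unit of server energy + cooling overhead
    print(f"{label:>13}: {total:.1f} units of facility energy per unit of server energy")
# Moving from 1.5 to 0.1 cuts total facility energy for the same compute by ~56% (2.5 -> 1.1).
```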
3. Can AI decrease energy usage?
Bill Gates claims that AI could reduce global energy demand because of the efficiencies it unlocks in the 98% of energy use that occurs outside of data centres. As an example, a building energy management system in Kansas cut energy usage by 16%, generating a two-year payback on the investment.6 Buildings in some form represent nearly 40% of US energy usage, so if that result can be scaled, the investment in AI begins to take on a different energy profile.7
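If the Kansas result could be scaled across the whole building stock (a strong assumption, used here only to size the opportunity), the arithmetic looks like this:

```python
# Sizing the opportunity: scaling the Kansas result across US buildings.
# Strong assumption: the 16% saving generalises to all buildings; in practice results will vary.
buildings_share_of_us_energy = 0.40   # buildings represent ~40% of US energy usage (cited above)
ai_driven_saving = 0.16               # 16% reduction from the Kansas building management system
potential_us_saving = buildings_share_of_us_energy * ai_driven_saving
print(f"Potential reduction in total US energy use: ~{potential_us_saving:.1%}")
# -> roughly 6% of total US energy, a large figure set against data centres' own consumption.
```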
Then there is the circular use case: AI working out how to cut the energy profile of AI. One leading IT company used AI to reduce the energy used to cool its data centres by about 40%, simply by predicting the most efficient ways to regulate temperature and airflow. With greater computing power can come greater efficiency.
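Folding that 40% cooling reduction into the cooling ratios discussed earlier (again a simplified sketch that treats cooling as the only overhead) gives a sense of its effect on total facility energy:

```python
# Effect of a 40% cut in cooling energy on total facility energy per unit of server energy.
# Uses the 0.6 cooling ratio cited earlier; cooling treated as the only overhead.
cooling_ratio = 0.6
cooling_cut = 0.40
before = 1 + cooling_ratio
after = 1 + cooling_ratio * (1 - cooling_cut)
print(f"Before: {before:.2f}  After: {after:.2f}  Total saving: {(before - after) / before:.0%}")
# A 40% cooling saving at a 0.6 ratio trims total facility energy by roughly 15%.
```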
Our View
In conclusion, while concerns about AI’s rising energy demands are
understandable, history and emerging evidence suggest that innovation
and efficiency will likely keep consumption in check. Advances in
hardware, smarter model design, and more effective cooling systems are
already mitigating much of the projected strain.
More importantly, AI itself is proving to be a powerful tool for reducing energy use across sectors. Rather than overwhelming the grid, AI may ultimately become one of the most effective means of managing and reducing global energy demand.
Whatever occurs, we feel some caution is warranted when considering the energy investments behind the AI trend.
1 IEA, Energy and AI, April 2025.
2 "Dig more coal -- the PCs are coming." Forbes, May 1999.
3 "Recalibrating global data center energy-use estimates." Science, February 2020.
4 IEA, Data Centres and Data Transmission Networks, January 2025.
5 Amory B. Lovins, Artificial Intelligence Meets Natural Stupidity: Managing the Risks, 2025.
6 Carol Ryan, "Energy-Guzzling AI Is Also the Future of Energy Savings." Wall Street Journal, April 2024.
7 U.S. Energy Information Administration, Monthly Energy Review, October 2025.