Our digital lives are powered by energy-hungry infrastructure running around the clock. Computer scientist Professor Mark Gahegan answers five questions about our exploding demand for data.

AI is leaving a significant footprint on the planet. Photo: Ella Don.

By Jamie Morton

Does our use of data and AI leave a footprint on the planet?

Yes, and it’s surprisingly large – though not always for the reasons we may think.

Most people imagine the environmental cost of AI sits in individual actions: asking ChatGPT a question, streaming a video, or storing photos in the cloud.

In reality, the footprint mostly comes from what sits behind the scenes: data centres, where vast amounts of computing power run around the clock.

As University of Auckland computer scientist Professor Mark Gahegan puts it, data centres aren’t just about data; they’re about computation.

“Data centres are a place where the computers are, and because you often need the computers to be next to the data, that’s also where the data lives.”

Those computers draw large amounts of power and need constant cooling to stop overheating. Globally, data centres already account for 0.5-1 percent of energy-related emissions and are one of the fastest-growing sources of electricity demand.

AI adds to this pressure, Gahegan says, but not every AI use is equal. In terms of energy use, the AI services we rely on are incredibly expensive to train, but often quite cheap to run.

Contrary to popular perception, a single AI query uses only a small amount of energy.

OpenAI’s Sam Altman has stated that an average ChatGPT query uses about 0.34 watt-hours of energy: roughly what an oven would use in a little over a second, or what a high-efficiency lightbulb would consume in a couple of minutes.

But the real cost, says Gahegan, sits in training the huge models that support AI chatbots, and in scaling them up to billions of uses. For example, GPT-4 used between 52 million kilowatt-hours and 62 million kilowatt-hours of electricity during its initial training phase.
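The training and per-query figures above can be put on the same scale with a back-of-envelope calculation (a sketch only: it uses the midpoint of the quoted training range and Altman's per-query figure, both of which are rough public estimates):

```python
# Back-of-envelope: how many queries equal one training run?
# Figures are the rough public estimates quoted in the article.

TRAINING_KWH = 57e6   # midpoint of the 52-62 million kWh range for GPT-4
QUERY_WH = 0.34       # Altman's figure for an average ChatGPT query

training_wh = TRAINING_KWH * 1000          # convert kWh to Wh
queries_to_match = training_wh / QUERY_WH  # queries matching the training energy

print(f"{queries_to_match:.2e} queries")   # on the order of 1.7e11
```

On these numbers, it would take roughly 170 billion queries to consume as much energy as the initial training run – which is why, at billions of uses, both sides of the ledger matter.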

How is this likely to change?

“Since the 1990s, supercomputing power has grown exponentially,” says Gahegan.

Almost every field is becoming more data-intensive, from medicine and genomics to climate science, video, gaming and business analytics.

Better cooling, more efficient chip design and smarter software can help, but they don’t cancel out growth.

Recent estimates suggest the energy and emissions footprint of data centres – particularly those supporting AI – is growing faster than earlier projections assumed.

The International Energy Agency estimates global data centre electricity consumption reached about 415 terawatt-hours in 2024 – 1.5 percent of global electricity use – and is rising at 12 percent a year.

Industry analysts point to even steeper near-term growth, with US research and advisory firm Gartner projecting 448 terawatt-hours in 2025 – a 16 percent annual increase – and S&P Global forecasting a 22 percent jump in grid power demand driven largely by AI workloads.

And though that 1.5 percent global share may still sound small, data centres are among the few sectors – alongside road transport and aviation – in which emissions are expected to rise rather than fall this decade.
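To see why a 12 percent annual growth rate is significant despite the small starting share, the IEA figures quoted above can be compounded forward (purely illustrative arithmetic, not an official forecast):

```python
# Compounding the IEA's 2024 baseline at the quoted 12% annual growth rate.
# Illustrative only -- actual demand depends on efficiency gains and uptake.

BASE_TWH = 415   # IEA estimate of data-centre electricity use in 2024
GROWTH = 0.12    # 12 percent per year, as quoted in the article

for year in range(2024, 2031):
    demand = BASE_TWH * (1 + GROWTH) ** (year - 2024)
    print(f"{year}: {demand:.0f} TWh")
```

At that rate, demand roughly doubles by 2030, to about 820 TWh a year.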

At this point, Gahegan says, there is no realistic path away from large data centres.

“I don’t think we’ll be moving away from them any time soon,” Gahegan says. “The need for that physical infrastructure is pretty hard to get around in a modern, digital economy.”

Keeping devices for longer is one way to reduce their environmental impact. Photo: Sigmund Fa.

Can AI help solve its own climate problem?

Yes, Gahegan says, but only if it’s deployed carefully and at scale.

Although AI uses an extraordinary amount of energy – ChatGPT reportedly consumes over half a million kilowatt-hours of electricity daily – it can also help reduce emissions elsewhere by making systems more efficient.
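The two reported figures – roughly half a million kilowatt-hours a day and 0.34 watt-hours a query – can be cross-checked against each other (a rough consistency sketch; both numbers are public estimates, not audited data):

```python
# Cross-check: daily consumption divided by per-query energy gives the
# implied number of queries per day. Both inputs are rough public estimates.

DAILY_KWH = 500_000   # reported daily ChatGPT electricity use
QUERY_WH = 0.34       # reported energy per average query

implied_queries = DAILY_KWH * 1000 / QUERY_WH
print(f"{implied_queries:.2e} queries/day")  # on the order of 1.5 billion
```

That implies on the order of 1.5 billion queries a day, which is broadly consistent with the scale of use OpenAI has described.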

Gahegan gives a striking example from climate science. Traditional weather-forecasting models can take hours to run on supercomputers. In 2024, Microsoft launched its AI-based climate forecasting system Aurora, which, after a very energy-intensive training phase, can produce forecasts in one minute.

“There’s a big upfront [energy] cost, but once the model exists, it’s much cheaper to run than the traditional high-performance computing used in science.”

International energy analysis suggests that AI could also help cut emissions by detecting methane leaks in oil and gas systems, optimising power plants, augmenting time-poor medical specialists, reducing energy use in buildings and improving transport efficiency.

Interestingly, AI may also soon be able to design more efficient AI systems, reducing both cost and power consumption. Current research also shows that smaller, more specialised AI agents can often be more effective and more energy-efficient.

In some scenarios, these savings could outweigh the emissions from data centres themselves.

“But that depends entirely on how AI is rolled out,” says Gahegan, and on whether efficiency gains are swallowed up by even more growth in uptake.

Where does NZ fit into all this?

Mostly at the edge – and that matters.

New Zealand uses large amounts of cloud computing, but much of the physical infrastructure is overseas, especially in Australia and other regions with more fossil-fuel-heavy electricity grids.

“Asking ChatGPT a question in New Zealand means relying on overseas data centres,” says Gahegan.

“That means that the electricity use is happening somewhere else.”

That creates an ethical tension. New Zealand benefits from digital services while much of the environmental cost is borne offshore.

Even when companies promise “carbon-free” data centres, that electricity is still being diverted from other uses, such as electrifying transport.

At the same time, very large data centres are often more energy-efficient than smaller, local setups.

“Moving things to big data centres usually means good news for energy efficiency,” says Gahegan, even if it’s uncomfortable to accept.

“Modern, liquid-cooled data centres are the most energy efficient way to deal with the demand, for now.”

So what can people do themselves?

More than you might think, though not where you might expect.

Gahegan stresses that home users are not the biggest drivers of the problem. Still, he argues it’s wrong to say individuals are powerless.

The biggest impact often comes before you even turn a device on.

“People tend to buy the most powerful computer they can,” he says.

“Looking at the energy draw of the thing you’re going to use every day is probably the most important contribution you can make.”

His practical advice: buy minimum-spec devices, keep devices longer and avoid unnecessary upgrades, turn them off or onto standby when not in use, and pay attention to recycling and the take-back schemes from manufacturers.

Businesses and corporate consumers can insist that their computing equipment is carefully recycled by their suppliers, he says, and so they should.

Counter-intuitively, running heavy computing tasks on a laptop is often worse than using a well-managed data centre.

“It’s much more energy-efficient to run larger compute workloads in a professionally managed data centre, than on your personal machine.”

He also emphasises better data habits. Not all data needs to be stored for ever.

Distilling what matters, archiving what’s rarely used, and deleting what has no value can significantly cut costs and energy use, especially for institutions.

• The world is facing unprecedented environmental challenges. Planetary Solutions, an initiative of the Sustainability Hub at Waipapa Taumata Rau, University of Auckland, and Newsroom, explores these issues — and the practical ways we can all be part of the solution.

This story was first published on Planetary Solutions on Newsroom on 17 February 2026.

Media contact

Rose Davis | Research communications adviser
M: 027 568 2715
E: rose.davis@auckland.ac.nz