
Source: The Conversation (Au and NZ) – By Richard Morris, Postdoctoral Fellow, Faculty of Agriculture and Life Sciences, Lincoln University, New Zealand


Cut the words “please” and “thank you” from your next ChatGPT query and, if some of the talk online is to be believed, you will be doing your bit to save the planet.

The idea sounds plausible because AI systems process text incrementally: longer prompts require slightly more computation and therefore use more energy. OpenAI’s chief executive Sam Altman has acknowledged that, at the scale of billions of prompts, those extra words add to operating costs.

At the same time, it is a stretch to suggest that treating ChatGPT politely comes at significant environmental cost. The effect of a few extra words is negligible compared with the energy required to operate the underlying data centre infrastructure.
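
A rough back-of-envelope sketch makes the point concrete. Every figure below – energy per query, tokens per query – is an illustrative assumption chosen for the arithmetic, not a measured value from OpenAI or from this article:

```python
# Back-of-envelope: marginal energy of a few polite tokens.
# All figures are assumed, illustrative values, not measurements.

WH_PER_QUERY = 0.3      # assumed average energy per query, in watt-hours
TOKENS_PER_QUERY = 500  # assumed tokens processed in a typical query
POLITE_TOKENS = 4       # roughly what "please" and "thank you" add

# If energy scaled linearly with token count (a simplification),
# politeness would cost per query:
extra_wh = WH_PER_QUERY * POLITE_TOKENS / TOKENS_PER_QUERY
print(f"Extra energy per polite query: {extra_wh * 1000:.2f} mWh")

# Scaled to a billion queries a day:
daily_kwh = extra_wh * 1e9 / 1000
print(f"Across 1 billion queries/day: {daily_kwh:,.0f} kWh")

# For comparison, a single large data centre drawing 100 MW continuously
# uses 100_000 kW * 24 h = 2.4 million kWh per day, roughly a thousand
# times the politeness overhead computed above.
```

Under those assumptions, a billion polite queries a day cost about a thousandth of what one large facility draws in the same period. The wording of individual prompts is noise against the infrastructure baseline.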

What is more important, perhaps, is the persistence of the idea. It suggests that many people already sense AI is not as immaterial as it appears. That instinct is worth taking seriously.

Artificial intelligence depends on large data centres built around high-density computing infrastructure. These facilities draw substantial electricity, require continuous cooling, and are embedded in wider systems of energy supply, water and land use.

As AI use expands, so does this underlying footprint. The environmental question, then, is not how individual prompts are phrased, but how frequently and intensively these systems are used.

Why every AI query carries an energy cost

One structural difference between AI and most familiar digital services helps explain why this matters.

When a document is opened or a stored video is streamed, the main energy cost has already been incurred. The system is largely retrieving existing data.

By contrast, each time an AI model is queried it must compute a response from scratch. In technical terms, each prompt triggers a fresh “inference” – a full computational pass through the model – and that energy cost is incurred every time.

This is why AI behaves less like conventional software and more like infrastructure. Use translates directly into energy demand.
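
A minimal sketch of that accounting difference, again with purely hypothetical energy figures chosen only to show the shape (this toy ignores the one-off cost of training the model, which does amortise; the per-query pass does not):

```python
# Toy accounting: a stored video pays its big cost once (production,
# encoding); an AI model pays a fresh computational cost on every query.
# All figures are hypothetical.

VIDEO_FIXED_WH = 50_000  # assumed one-off cost to produce and encode a video
VIDEO_SERVE_WH = 0.05    # assumed cost to serve one additional view
AI_INFERENCE_WH = 0.3    # assumed cost of one full inference pass

def video_energy_per_use(uses: int) -> float:
    """Fixed cost amortises: per-view energy falls as views grow."""
    return VIDEO_FIXED_WH / uses + VIDEO_SERVE_WH

def ai_energy_per_use(uses: int) -> float:
    """No amortisation of the per-query pass: each use costs the same."""
    return AI_INFERENCE_WH

for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"{n:>13,} uses: video {video_energy_per_use(n):8.4f} Wh/use, "
          f"AI {ai_energy_per_use(n):.4f} Wh/use")
```

As use grows, the stored video’s per-use energy collapses toward the cost of moving bytes, while the AI model’s per-use energy stays flat. That is what it means for use to translate directly into energy demand.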

The scale of that demand is no longer marginal. Research published in the journal Science estimates that data centres already account for a significant share of global electricity consumption, with demand rising rapidly as AI workloads grow.

The International Energy Agency has warned that electricity demand from data centres could double by the end of the decade under current growth trajectories.
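
To put “double by the end of the decade” in arithmetic terms (a simple calculation, not a figure from the IEA): doubling over roughly five years implies compound annual growth of about 15%:

```python
# Doubling over five years implies 2 ** (1/5) - 1, about 14.9% per year.
years = 5
cagr = 2 ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")
```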

Electricity is only one part of the picture. Data centres also require large volumes of water for cooling, and their construction and operation involve land, materials and long-lived assets. These impacts are experienced locally, even when the services provided are global.

AI’s hidden environmental footprint

New Zealand offers a clear illustration. Its high share of renewable electricity makes it attractive to data centre operators, but this does not make new demand impact-free.

Large data centres can place significant pressure on local grids, and claims of renewable supply do not always correspond to new generation being added. Electricity used to run servers is electricity not available for other uses, particularly in dry years when hydro generation is constrained.

Viewed through a systems lens, AI introduces a new metabolic load into regions already under strain from climate change, population growth and competing resource demands.

Energy, water, land and infrastructure are tightly coupled. Changes in one part of the system propagate through the rest.

This matters for climate adaptation and long-term planning. Much adaptation work focuses on land and infrastructure: managing flood risk, protecting water quality, maintaining reliable energy supply and designing resilient settlements.

Yet AI infrastructure is often planned and assessed separately, as if it were merely a digital service rather than a persistent physical presence with ongoing resource demands.

Why the myth matters

From a systems perspective, new pressures do not simply accumulate. They can drive reorganisation.

In some cases, that reorganisation produces more coherent and resilient arrangements; in others, it amplifies existing vulnerabilities. Which outcome prevails depends largely on whether the pressure is recognised early and incorporated into system design or allowed to build unchecked.

This is where discussion of AI’s environmental footprint needs to mature. Focusing on small behavioural tweaks, such as how prompts are phrased, distracts from the real structural issues.

The more consequential questions concern how AI infrastructure is integrated into energy planning, how its water use is managed, how its location interacts with land-use priorities, and how its demand competes with other social needs.

None of this implies that AI should be rejected. AI already delivers value across research, health, logistics and many other domains.

But, like any infrastructure, it carries costs as well as benefits. Treating AI as immaterial software obscures those costs. Treating it as part of the physical systems we already manage brings them into view.

The popularity of the “please” myth is therefore less a mistake than a signal. People sense AI has a footprint, even if the language to describe it is still emerging.

Taking that signal seriously opens the door to a more grounded conversation about how AI fits into landscapes, energy systems and societies already navigating the limits of adaptation.

Richard Morris is the co-founder of Kirini Ltd, a nature-based solutions consultancy. He receives funding from Lincoln University.

ref. Does adding ‘please’ and ‘thank you’ to your ChatGPT prompts really waste energy? – https://theconversation.com/does-adding-please-and-thank-you-to-your-chatgpt-prompts-really-waste-energy-272258
