Prompt Playbook: Big Questions in AI PART 4

Hey Prompt Entrepreneur,

The environmental impact question always comes up. And I mean always. Whether it's during a keynote, in client meetings, or casual conversations about AI, it never fails.

Once, I'd barely finished a talk when a hand shot up: "But what about all the water AI uses? I read that each ChatGPT query guzzles a whole bottle of water!"

I had seen them shaking their head and squinting at me throughout the talk, so I knew something was coming!

The room nodded knowingly. This tidbit had clearly made the rounds—another AI horror story to add to the collection. I took a deep breath, knowing I was about to burst a rather popular bubble.

"Actually," I said, "the water story is mostly a red herring. The real environmental challenge with AI is electricity, not water. And even that needs context."

The puzzled looks told me this was going to be one of those conversations. Perfect. This is exactly the kind of nuanced discussion we need to have about AI's environmental impact.

Let’s get started:

Summary

Is AI bad for the environment?

  • The "bottle of water per query" claim

  • The real issue: electricity consumption

  • How AI's energy use compares to other industries and everyday activities

  • Industry responses including nuclear power and efficiency improvements

  • Practical steps businesses can take to minimise their AI environmental footprint

The Three Camps

When it comes to AI's environmental impact, I typically encounter three distinct perspectives:

Position 1: "AI Is an Environmental Catastrophe"

The environmentally alarmed view sees AI as an ecological disaster in the making:

"Every ChatGPT query consumes a bottle of water! Data centres will soon use as much electricity as entire countries! We're facing a climate emergency—we can't afford this frivolous technology burning through resources just so people can generate cat memes and crappy poetry."

This perspective often cites dramatic statistics about water usage and exponential growth in energy consumption, painting AI as an unsustainable luxury we can't afford in a warming world.

Position 2: "Environmental Concerns Are Overblown"

At the opposite extreme, the techno-optimist view dismisses environmental worries entirely:

"AI efficiency improves faster than consumption grows. Besides, AI will solve far more environmental problems than it creates—from optimising power grids to accelerating renewable energy research. Market forces ensure efficiency improvements happen automatically."

This camp sees environmental concerns as either exaggerated or soon to be solved by the very technology being criticised.

Basically, we'll be able to outrun the problems we create using the very same AI. A very common human point of view! We'll fix it later, don't worry!

Position 3: "Real Concerns, But Context Matters"

My position occupies a more nuanced middle ground.

AI's environmental impact is real and growing, but it needs to be understood accurately and in context. We have to look at the actual facts and figures (gross!) rather than just our opinions and what we want to be true.

The Water Myth

First, let's tackle that water bottle claim head-on, because it gets in the way of actually discussing the issue. The idea that each ChatGPT query uses a bottle of water has gone viral, but it fundamentally misunderstands how modern data centre cooling works.

Training models uses a lot of GPUs. And they get hot. Real hot.

To keep them running efficiently they need to be cooled down. Cold water is used to do so.

Yes, some older data centres use evaporative cooling, where water evaporates to remove heat. But the industry is rapidly shifting to closed-loop systems where water circulates without evaporation: the water is chilled, run through the servers to draw heat off the GPUs, and the warm water is returned to a reservoir to be chilled again.

Microsoft, for example, announced that all new data centre designs from August 2024 will use zero-water evaporation cooling technology, eliminating cooling-related water consumption entirely.

The fixation on water consumption is understandable—it's easy to visualise "a bottle of water per query." It’s dramatic. And scary sounding. But it's increasingly outdated as cooling technology evolves. The real environmental challenge lies elsewhere.

The Real Issue: Electricity

While water concerns are a bit of a red herring, electricity consumption is AI's genuine environmental challenge.

Cooling that water requires, you guessed it, electricity.

As do all the other data centre operations during training and inference.

Electricity is the problem here. Not water.

I'm going to give you current figures, but I highly recommend you research these yourself. Not hard with Deep Research modes! These figures are always changing, so being current is important.

Here’s the high level summary without getting into the weeds!

As of 2024, data centres already draw about 415 TWh of electricity a year — roughly 1.5 % of all power generated worldwide.

The IEA’s new Energy & AI report says that demand will more than double to around 945 TWh by 2030, a load comparable to Japan’s entire grid.

Electricity used specifically for AI‑optimised servers is expected to more than quadruple over that same period, making AI the biggest single driver of the surge.

If those projections hold, data‑centre operations could soak up close to 3 % of global electricity within five years.
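These projections are easy to sanity-check yourself. Here's a minimal sketch in Python, assuming only the figures quoted above (415 TWh in 2024, roughly 1.5% of global generation, projected 945 TWh by 2030) and, for simplicity, treating global generation as flat:

```python
# Sanity-check of the figures quoted above (rough assumptions, not official data).
current_twh = 415     # data centre demand, 2024
projected_twh = 945   # IEA projection for 2030
share_2024 = 0.015    # ~1.5% of global electricity generation

growth = projected_twh / current_twh   # growth factor over the period
share_2030 = share_2024 * growth       # implied share if global generation stayed flat

print(f"Growth factor: {growth:.2f}x")
print(f"Implied 2030 share: {share_2030:.1%}")
```

The implied share comes out a little above 3%; in practice global generation also grows over the same period, which is why the projection lands at "close to 3%".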

It’s a lot. And it’s growing. That is undeniable.

Yes, AI is energy-intensive. But how does it compare to other activities we take for granted? This is where it gets interesting.

The problem with decrying AI’s energy usage in isolation is that it ignores the larger context.

I always get comments on my TikToks and YouTube videos about AI’s energy usage so this is a good place to begin the comparison.

A single ChatGPT reply is about 0.3 watt-hours.

Watching a 5-minute HD YouTube video? 12 watt-hours.

And if that video gets 10 million views? 120 million watt-hours, or the equivalent of 400 million AI queries. Video consumption, unlike the AI queries we each send individually to ChatGPT, scales massively with audience size.
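You can run that comparison yourself. A quick back-of-envelope calculation in Python using the per-unit figures above (which are rough estimates; real numbers vary by model, resolution and data centre):

```python
# Back-of-envelope energy comparison using the figures in the text.
CHATGPT_REPLY_WH = 0.3   # watt-hours per ChatGPT reply (rough estimate)
HD_VIDEO_VIEW_WH = 12    # watt-hours per 5-minute HD video view (rough estimate)
VIEWS = 10_000_000       # a modestly viral video

total_wh = HD_VIDEO_VIEW_WH * VIEWS            # total streaming energy
equivalent_replies = total_wh / CHATGPT_REPLY_WH

print(f"Total streaming energy: {total_wh / 1e6:.0f} million Wh")
print(f"Equivalent ChatGPT replies: {equivalent_replies:,.0f}")
```

The point isn't the exact numbers; it's that consumption scales with audience, so one popular video dwarfs an individual's entire AI usage.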

What about TikTok and other platforms?

My viral TikTok videos, watched by 10+ million people, have done incalculably more damage to the environment than any AI work I have done. Hell, more than the AI work I’ve inspired other people to do.

We have to be aware of all the other uses of energy. This is why, when someone hyper-fixates on AI’s electricity impact, it’s important to remind them of all the other uses of power they engage in every day.

(Yes, training AI rather than inference also uses a lot of power - I’ve left this out for simplicity. But remember that producing TikTok videos, Netflix shows and the like also takes a massive amount of energy too!)

Also, here’s the biggie. Above I mentioned that data centres currently consume 1.5% of the world’s electricity, potentially doubling to 3% in five years.

Well…what’s that other 98.5% or 97%?

That’s manufacturing, transportation, construction, consumer electronics etc.

In fact household IT devices alone use double the electricity of data centres!

We don’t hear that much about this remaining 97% because, unlike AI, it’s not emotive.

Industry Response: Not Standing Still

The AI industry has also recognised that electricity is a problem.

Companies like Microsoft have signed nuclear deals whilst Google is going for geothermal.

At the same time, model efficiency is always improving: newer models achieve better results with lower energy requirements, and hardware improvements continue to reduce power per computation.

Don’t get me wrong - all of this isn’t out of the goodness of their hearts.

It’s about profit.

AI is power hungry as we have seen. So the company with the cheapest access to electricity (by, you know, building your own nuclear power stations) and the most efficient models will have the lowest costs.

Lower the electricity bill and the company profits more. These companies may wrap their initiatives up in environmental clothing but there is raw profit at the root of their decisions.

Which, for better or worse(!), is arguably a much stronger force than their sense of responsibility. 😆 

Practical Steps for Businesses

All this is well and good. But if you are discussing energy usage with an audience or client it’s even better to give them solid steps they can actually take.

Some practical steps:

1. Right-Size Your Models Don't use GPT-4o for tasks that GPT-3.5 can handle. Model selection significantly impacts energy use.

2. Optimise Token Usage Be concise with prompts and context. Every token processed consumes energy. As Sam Altman has told us - stop saying please and thank you!

3. Strategic Timing and Location

Run intensive tasks during off-peak hours, when grids often have a cleaner energy mix. Train models in regions with cleaner energy (France's nuclear grid is a popular choice in Europe). Here’s a handy map to help here.

4. Choose Green Providers Select cloud providers committed to renewable energy and efficient operations.

5. Deploy Local Models When Appropriate For simple, repetitive tasks, local models can be more efficient than cloud APIs.
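As an illustration of step 1 (right-sizing), here's a hypothetical router sketch in Python that sends simple requests to a smaller model and reserves the large model for complex ones. The model names and the `is_complex` heuristic are illustrative assumptions, not a real API:

```python
# Hypothetical model router: cheap model for simple jobs, big model for hard ones.
# Model names and the heuristic below are illustrative, not a real provider API.

def is_complex(prompt: str) -> bool:
    # Crude heuristic: long prompts or reasoning keywords go to the big model.
    keywords = ("analyse", "step by step", "compare", "strategy")
    return len(prompt) > 500 or any(k in prompt.lower() for k in keywords)

def pick_model(prompt: str) -> str:
    return "large-model" if is_complex(prompt) else "small-model"

print(pick_model("Summarise this sentence."))             # small-model
print(pick_model("Compare these two market strategies"))  # large-model
```

In production you'd use a proper classifier or the provider's own routing features, but even a crude gate like this cuts energy (and cost) on the bulk of everyday requests.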

This is often the best way to deal with these questions at public talks - basically turn it around on them. Assume that AI is here to stay (it is) and that any company without it is in trouble (they are). And from here segue into helping them limit their impact. This brings the discussion from an emotive “AI is destroying the planet” to a productive “Here’s an actual plan”.

Argument Summary

When addressing AI's environmental impact:

If water consumption fears are based on outdated understanding of cooling systems,

Then focusing on electricity usage provides a more accurate picture of AI's environmental impact.

If AI currently represents a tiny fraction of global energy consumption (1-3%),

Then environmental concern should be proportional, not panicked.

If AI companies have profit motives to improve efficiency (energy costs money),

Then market forces will naturally drive continued improvements in energy efficiency.

If we're concerned about AI's energy use,

Then we should apply similar scrutiny to all our digital activities—streaming, gaming, social media—not single out AI.

Therefore AI's environmental impact requires balanced attention: real concerns addressed through practical solutions, not dismissed or catastrophised.

This is my perspective— as always adapt it based on your industry context and local environmental priorities.

There are genuine concerns in play here. We just need to be sure to focus on the real issues rather than the headline du jour.

What's Next?

Next up we'll tackle our final curveball question: "Is AI development moving too fast?" This question touches on safety, regulation, and the tension between innovation and responsibility.

I'll share how to address concerns about AI's somewhat scary breakneck pace while acknowledging the competitive (and, gulp, geopolitical) realities driving rapid development.

Keep Prompting,

Kyle

When you are ready

AI Entrepreneurship programmes to get you started in AI:

70+ AI Business Courses
✓ Instantly unlock 70+ AI Business courses
✓ Get FUTURE courses for Free
✓ Kyle’s personal Prompt Library
✓ AI Business Starter Pack Course
✓ AI Niche Navigator Course
Get Premium

AI Workshop Kit
Deliver AI Workshops and Presentations to Businesses with my Field Tested AI Workshop Kit  Learn More

AI Authority Accelerator 
Do you want to become THE trusted AI Voice in your industry in 30-days?  Learn More

AI Automation Accelerator
Do you want to build your first AI Automation product in 30-days?  Enrol Now

Anything else? Hit reply to this email and let’s chat.

If you feel this — learning how to use AI in entrepreneurship and work — is not for you → Unsubscribe here.