AI’s Thirsty Ghost: Why We Need to Move Big Tech Off-World and Zone Earth “Residential”

We talk about AI like it’s weightless.

“Just in the cloud.”
“Running in the background.”
“Ask the model, it’s basically free.”

But behind every “free” AI answer is a very physical, very thirsty machine. Racks of hot chips. Fans and pipes. Cooling towers dumping heat into the air and water into the sky.

And all of that is sitting on a planet that is already heading toward a massive freshwater shortfall by the end of this decade, with experts warning that global freshwater demand could exceed sustainable supply by around 40% by 2030 (World Economic Forum; World Green Building Council).

In this piece I want to do three things:

  1. Show how AI’s water footprint fits into the wider global water crisis.
  2. Explain why cooling + climate change is a double bind we can’t ignore.
  3. Make the case that, long-term, we should treat Earth as a residential zone and push heavy computation and industrial infrastructure off-world.

It’s a big claim. But if you accept that water is non-negotiable for life, the conclusion starts to feel less sci-fi and more like basic zoning law for a planetary civilisation.

Big Nose Knows… water is life, not bandwidth.



1. AI is not “virtual” – it’s concrete, steel and a lot of water

Let’s strip away the marketing.

An AI model doesn’t “live in the cloud.” It lives in:

  • Data centres (warehouses full of servers),
  • Plugged into power stations,
  • Wrapped in cooling systems that move heat into air and water.

The energy side

According to the International Energy Agency’s “Energy and AI” report and related analysis, data centres currently use around 1–2% of the world’s electricity, and their demand is projected to more than double by 2030, to around 945 TWh, with AI as the main driver (IEA news release).

That’s roughly Japan’s entire annual electricity consumption, just to keep our digital life humming.
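The Japan comparison is easy to sanity-check yourself. A minimal sketch, assuming Japan’s annual electricity consumption is roughly 940 TWh (an approximate figure used here purely for scale):

```python
# Back-of-envelope check of the "Japan-sized" comparison.
# The IEA projects ~945 TWh of data-centre demand by 2030;
# Japan's annual electricity consumption is assumed here to be
# ~940 TWh (approximate, for illustration only).
PROJECTED_DATA_CENTRE_TWH_2030 = 945
JAPAN_ANNUAL_TWH = 940  # approximate, for scale only

ratio = PROJECTED_DATA_CENTRE_TWH_2030 / JAPAN_ANNUAL_TWH
print(f"Projected data-centre demand is ~{ratio:.2f}x Japan's annual use")
```

Whatever the exact figure turns out to be, the ratio lands close to 1: one country’s worth of electricity, for data centres alone.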

The water side (the part almost nobody talks about)

Water is where it gets quietly brutal.

Analyses summarised by the OECD and academic researchers suggest that AI’s global annual water demand could reach 4.2–6.6 billion cubic metres by 2027 just for cooling and power generation — more than the total annual water withdrawal of several small countries combined (OECD: “How much water does AI consume?”; University of Illinois: “AI’s Challenging Waters”).

To translate that:

  • We’re talking in the ballpark of multiple small countries’ annual water use, just to cool and power digital infrastructure.
  • Most of this is indirect – water used to generate electricity – plus direct water for cooling towers and chillers.
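To make the 4.2–6.6 billion cubic metres concrete, here is a rough translation into human terms. The 50 litres/person/day figure is an assumption in the spirit of WHO guidance on basic domestic water needs, used purely for scale:

```python
# Rough translation of AI's projected 4.2-6.6 billion m3/year of
# water demand into people-equivalents, assuming ~50 L/person/day
# as a basic domestic water need (illustrative assumption).
LITRES_PER_M3 = 1_000
BASIC_NEED_L_PER_DAY = 50

def people_equivalent(billion_m3: float) -> float:
    """How many people's basic annual water needs this volume covers."""
    litres = billion_m3 * 1e9 * LITRES_PER_M3
    return litres / (BASIC_NEED_L_PER_DAY * 365)

low, high = people_equivalent(4.2), people_equivalent(6.6)
print(f"~{low/1e6:.0f}-{high/1e6:.0f} million people's basic needs")
```

On those assumptions, the projected demand covers the basic water needs of a few hundred million people a year.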

On a planetary spreadsheet, AI is still a small slice compared to agriculture. But:

  • It’s growing extremely fast.
  • It’s concentrated in specific basins: one hyperscale data centre can put real pressure on a local aquifer or river. In drought-stricken regions, proposed AI and cloud campuses are already sparking public backlash (example: datacentres in drought-hit Latin America).

So every time we add a new “AI campus” in a water-stressed region, we’re effectively saying:

“We’re okay converting drinking water and river water into heat so we can autocomplete emails faster.”

That choice might be defensible in a wet, cool climate with abundant renewables. In a drying, heating world, it’s a different story.


2. The bigger background: a looming water gap

AI isn’t creating the water crisis. It’s piling onto it.

Multiple UN-linked reports and global commissions on water economics are saying essentially the same thing:

  • Global freshwater demand is rising so fast that it is projected to outpace sustainable supply by around 40% by 2030 without major reforms (WEF summary; Global Commission on the Economics of Water).
  • More than two billion people already live in countries where water supply is inadequate, and roughly four billion people experience severe water scarcity for at least one month a year (UNICEF: Water scarcity).
  • By 2030, up to 700 million people could be displaced by intense water scarcity (again from UNICEF’s projections).

At the same time, analysis from the World Green Building Council suggests that the built environment already accounts for roughly 15% of global freshwater use (“The Water Paper”).

The main culprits behind the broader water crisis are:

  • Agriculture (by far the biggest user),
  • Electricity generation,
  • Industry and the built environment.

AI plugs into two of those:

  1. It rides on top of electricity (which is often water-hungry).
  2. It lives in buildings and campuses that are part of the built environment, already responsible for a significant share of global freshwater use.

So we’re adding a rapidly expanding, high-status water user into a system that is already overdrawn.

We’re not arguing about luxuries at the margins. We’re arguing, quite literally, about who gets to drink, grow food, and stay put.


3. Cooling: where bits meet boiling point

Why does AI need so much cooling?

Because modern GPUs and accelerators are basically little suns. Cram millions of them into racks, run them 24/7, and you have:

An industrial heat problem masquerading as “cloud computing.”

Most large data centres use a combination of:

  • Air cooling (blowing huge volumes of conditioned air through hot racks),
  • Chilled water loops (circulating water to absorb heat),
  • Evaporative cooling towers (where water is literally evaporated to shed heat).

Evaporation doesn’t just “borrow” water. It consumes it. Once it’s vapour in the sky, it’s gone from that basin (at least in any useful, predictable way).

As AI workloads scale up, operators face a choice:

  1. Use more evaporative cooling (water-intensive, more energy-efficient), or
  2. Use more closed-loop chillers (less water, more electricity),

…or move into liquid immersion and other exotic cooling, which again has water and energy implications in the wider system.
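The physics puts a hard floor under how thirsty evaporative cooling is. A first-principles sketch, assuming the latent heat of vaporisation of water is roughly 2.45 MJ/kg near ambient temperature:

```python
# Why evaporative cooling is so thirsty: a first-principles lower bound.
# Every kWh of server heat rejected by evaporation boils off water at
# the latent heat of vaporisation (~2.45 MJ/kg near ambient).
LATENT_HEAT_MJ_PER_KG = 2.45   # approximate, at ~25 C
MJ_PER_KWH = 3.6               # 1 kWh = 3.6 MJ

def litres_evaporated_per_kwh() -> float:
    """Idealised minimum water evaporated per kWh of heat rejected."""
    return MJ_PER_KWH / LATENT_HEAT_MJ_PER_KG  # 1 kg of water ~ 1 litre

print(f"~{litres_evaporated_per_kwh():.2f} L of water per kWh of heat")
```

That is a theoretical minimum of roughly 1.5 litres per kilowatt-hour of heat; real cooling towers use more once you account for drift and blowdown.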

Right now, a lot of these decisions are being made on cost and latency, not on water ethics.


4. Climate change makes every litre more political

All of this is happening inside a climate system that we’ve already destabilised.

By around 2030, we’re very likely to be hovering near +1.5°C of global warming relative to pre-industrial levels, with current policy settings pointing toward substantially more by 2100 (UN SDG 6 overview).

What does that mean for water?

  • More intense droughts in already dry regions.
  • Flashier floods, which are terrible for infrastructure and not great for reservoirs.
  • Snowpack and glaciers shrinking, removing nature’s slow-release water tanks.
  • Higher evaporation rates from dams, rivers and soils.

Meanwhile, analysis from the World Resources Institute and others shows that 25 countries already face “extremely high” water stress, together home to roughly a quarter of the world’s population (WRI Aqueduct; Earth.org summary).

The same regions that tech companies like for data centres — cheap land, growing cities, sometimes lax regulation — are often regions where water stress is climbing.

It’s a tight little loop:

  1. We burn energy, heat the planet, and destabilise water cycles.
  2. We build AI and data centres to help “optimise” responses to that chaos.
  3. The cooling of those data centres puts more stress on local water systems, which climate change is already stretching.

There’s a very Big Nose question hiding in there:

At what point does this stop being “innovative” and start being obviously self-harming behaviour?


5. “But we can make AI greener, right?”

Yes, and we absolutely should. There are smart people pushing:

  • Circular water systems in data centres (reuse, recycling, on-site treatment and replenishment), as highlighted in recent work on data centre water circularity (World Economic Forum).
  • Locating data centres in cooler, wetter regions with renewable power.
  • Shifting workloads in time and space to match green power and water availability.
  • Better measurement and transparency: publishing water-per-query numbers, not just vague “we’re carbon neutral” slogans.
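“Shifting workloads in space” can be sketched in a few lines. Everything here is hypothetical — the region names, the stress scores, and the scoring rule are illustrations, not any real scheduler’s API:

```python
# A toy sketch of water-aware workload placement: prefer regions with
# low water stress, breaking ties on renewable share. All names and
# numbers below are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    water_stress: float     # 0 (abundant) .. 1 (extreme stress)
    renewable_share: float  # 0 .. 1 of grid power from renewables

def pick_region(regions: list[Region]) -> Region:
    """Prefer low water stress, then high renewable share."""
    return min(regions, key=lambda r: (r.water_stress, -r.renewable_share))

regions = [
    Region("desert-west", water_stress=0.9, renewable_share=0.6),
    Region("nordic-1", water_stress=0.1, renewable_share=0.9),
    Region("coastal-east", water_stress=0.4, renewable_share=0.5),
]
print(pick_region(regions).name)  # the least water-stressed region
```

The point isn’t the code, it’s the objective function: today cost and latency dominate it, and water barely appears.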

We should push hard on all of this.

But there are two uncomfortable realities:

  1. Demand growth is outpacing efficiency.
    Even if each AI query gets “less thirsty,” the total number of queries is exploding, and big models are getting bigger. Electricity and water use can still go up while per-unit footprints go down (the IEA explicitly flags this in its projections for AI-driven electricity demand).
  2. Ethical scarcity is local, not global.
    It doesn’t matter that some other river on the other side of the planet has plenty of flow. What matters is the basin where your data centre lives — and whether the people, farms and ecosystems there can spare the water you’re evaporating.

So yes, “green AI” is essential. But efficiency tweaks inside the same mental model (“infinite digital growth on a finite wet rock”) are ultimately just that: tweaks.

We need to shift the frame.


6. Earth as a residential zone, not a server room

Zoning laws exist in cities for a reason.

We don’t let heavy chemical plants set up shop in the middle of residential streets. Not because chemistry is evil, but because living bodies are fragile, and we’d like kids to grow up breathing something other than fumes.

It’s time to apply the same logic to the planet as a whole.

A simple idea

  • Treat Earth primarily as a place for life: humans, animals, plants, ecosystems.
  • Treat heavy industry and hyperscale computation as activities that should gradually move off-world wherever physics and economics allow.
  • In other words: Earth zoned residential; deep space zoned industrial and computational.

This isn’t about magically lifting every data centre into orbit next year. It’s a direction of travel:

  • As we invest in AI and computing, we ask: Where should this ultimately live?
  • As we put billions into chips, we put serious money into space-based power and compute too.
  • As we regulate AI, we also regulate its water rights and tie them to a long-term off-world migration.

“Move tech off-world” stops being a sci-fi daydream and becomes a planning principle, like banning asbestos or phasing out leaded petrol.


7. What could off-world compute actually look like?

Let’s stay grounded and specific.

Some parts of AI absolutely must stay on Earth:

  • Latency-critical systems (surgery robots, autonomous vehicles, real-time control).
  • Devices that need local processing when networks fail.

But a huge amount of compute is batchable:

  • Model training and retraining,
  • Bulk simulation and research workloads,
  • Massive offline analytics,
  • Archival storage and long-term indexing.

Those jobs can tolerate a few hundred milliseconds, or even seconds, of extra latency.
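Those latency numbers aren’t hand-waving; they fall straight out of the speed of light. A quick check, using approximate distances to geostationary orbit and the Moon:

```python
# Sanity check on the latency claim: speed-of-light round-trip times
# to plausible off-world compute locations (distances approximate).
C_KM_PER_S = 299_792  # speed of light in vacuum

def round_trip_s(distance_km: float) -> float:
    """Minimum round-trip signal time over a given distance."""
    return 2 * distance_km / C_KM_PER_S

geo = round_trip_s(35_786)     # geostationary orbit altitude
moon = round_trip_s(384_400)   # average Earth-Moon distance
print(f"GEO: ~{geo * 1000:.0f} ms round trip, Moon: ~{moon:.1f} s")
```

Roughly a quarter of a second to GEO and under three seconds to the Moon: fatal for a surgery robot, irrelevant for an overnight training run.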

For that slice, you can imagine:

  1. Solar-powered compute in high Earth orbit or cislunar space
    Solar intensity is higher and more stable; waste heat can be shed into space without boiling rivers; you beam down only the results, not the heat.
  2. Lunar or asteroid-based facilities (longer term)
    Use in-situ resources for construction; keep the dirtiest parts of the chip supply chain off the biosphere; again, heat goes into vacuum, not into aquifers.
  3. Hybrid architectures
    Keep a thin “edge intelligence” layer on Earth for responsiveness; push heavy lifting to off-world “compute farms” as they come online.

Is this technically easy? No.
Is it crazier than assuming we can keep adding planet-scale heat and water loads to fragile local systems indefinitely? Also no.

We already accept enormous engineering challenges for energy (offshore platforms, deep-sea drilling, fusion research). Doing the same for water-neutral, off-world computation is at least in the same ballpark.


8. The steps between here and there

You don’t go from a chatbot to “Lunar Compute Park v3.0” in one jump.

But you can start aligning today’s decisions with that end state.

1. Radical transparency

  • Mandatory disclosure of water withdrawals and consumption for data centres and AI workloads, basin by basin.
  • Clear, comparable metrics (water per query, water per training run, etc.).
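A “water per query” metric is not exotic; it composes two numbers operators already track or could. A minimal sketch, combining a facility’s Water Usage Effectiveness (WUE, litres of water per kWh of IT energy) with a per-query energy cost — the 0.3 Wh/query figure below is purely illustrative, not a measurement:

```python
# Sketch of a per-query water metric: facility WUE (litres of water
# per kWh of IT energy) times the energy cost of one query.
# The example inputs (1.8 L/kWh, 0.3 Wh/query) are assumptions.
def water_per_query_ml(wue_l_per_kwh: float, wh_per_query: float) -> float:
    """Millilitres of water attributable to a single query."""
    kwh = wh_per_query / 1_000
    return wue_l_per_kwh * kwh * 1_000  # litres -> millilitres

# e.g. a facility with WUE of 1.8 L/kWh, 0.3 Wh per query (assumed):
print(f"~{water_per_query_ml(1.8, 0.3):.2f} mL per query")
```

Tiny per query, enormous at planetary scale — which is exactly why the disclosure has to be mandatory and comparable rather than voluntary and vague.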

No more “trust us, it’s green-ish.”

2. Water-aware regulation

  • Hard limits or moratoriums on new AI-heavy data centres in water-stressed catchments.
  • Linking AI expansion to proven investments in local water resilience: leak reduction, reuse plants, wetland restoration, and so on.
  • Treating water as a commons with rights, not just an industrial input.

3. Planetary zoning mindset

  • Updating national AI and digital strategies to include water and climate constraints, not just GDP and “innovation”.
  • Funding programs explicitly aimed at off-world compute and space-based solar: moving the heat away from home.
  • Making “Earth zoned residential” a serious policy conversation, not just a sci-fi convention panel.

4. Culture: change what we celebrate

As consumers, voters and creators we can:

  • Stop treating infinite, frictionless AI as the only acceptable user experience.
  • Start celebrating “slow where it matters”: models that are water-aware, region-aware, and don’t try to answer every trivial prompt at planetary scale.
  • Ask uncomfortable questions of tech companies:
    “How much water did this feature cost?”
    “Where does your cooling water come from, and who else needs it?”

If a feature’s answer is “we’re evaporating drinking water in a drying basin so your feed loads 0.2 seconds faster,” maybe that’s not a feature worth having.


9. Choosing what sort of civilisation we want to be

AI is not evil.
Data centres are not evil.
Space industry is not evil.

They are just tools.

The question is whether we:

  • Keep bolting them onto a civilisation that already overshoots its water budget, or
  • Re-architect the whole thing so that our intelligence and our infrastructure stop cannibalising the conditions for life.

From a Big Nose point of view, the story almost writes itself:

  • We are staring down a world where freshwater demand may exceed sustainable supply within a decade.
  • At the same time, we are cheerfully building an AI infrastructure that will drink billions of cubic metres of water a year, even as hundreds of millions face water stress and displacement.
  • We do have alternatives — from circular water systems to radical efficiency, to eventually lifting the hottest, thirstiest parts of our compute stack into space.

The sane response isn’t to shut down AI or retreat into caves.

It’s to say:

“Earth is for living things. If our machines want to run hot, they can do it where there’s no river to boil.”

Big Nose Knows…
If we treat water as negotiable, everything else becomes negotiable too.
If we get the zoning right — Earth for life, space for the heavy lifting — we might actually get a future where AI helps us thrive without quietly drying out the planet under our feet.