
Mother Pelican
A Journal of Solidarity and Sustainability

Vol. 21, No. 9, September 2025
Luis T. Gutiérrez, Editor


The Myth of Infinite Growth

Kumbirai Thierry Nhamo

This article was originally published on
Zealous Thierry, 8 August 2025

REPUBLISHED WITH PERMISSION



Google’s data center in The Dalles, Oregon, about four miles downstream from the Columbia River dam that provides hydropower to the region. Photo by Tony Webster via Wikimedia Commons.


The tide of optimism about technology has swelled in the past few years.  Every month brings news of a faster chip, a smarter machine, or a grand vision of AI solving our problems.  The prevailing creed assumes innovation has no bounds, and that each breakthrough simply opens new frontiers.  But this bright faith in endless growth collides with a more sober reality.  Our world has physical limits.  The towering server farms and AI centers we build consume vast quantities of water and power.  To believe we can march on without consequences is to ignore those limits.

Data centers now guzzle electricity and water on an extraordinary scale.  In the United States, the growth of AI and cloud computing has driven data center energy use into the stratosphere. 

A new report from the Department of Energy shows U.S. data centers sucked down about 176 terawatt-hours of electricity in 2023 – roughly 4.4% of the nation’s total power use.  That’s enough to power millions of homes for a year.  In 2014, the figure was just 58 TWh.


Data centers are claiming a large share of power use.

In only a decade, power draw tripled.  And experts say this trend is only accelerating: data center demand could double or even triple by 2028.  Under these projections, data centers could consume between 325 and 580 TWh by 2028 – a rise so dramatic it may soon strain the entire grid.

To grasp the scale: one analysis finds AI alone might require 300 TWh per year by 2028, enough electricity for 28 million U.S. households.  The servers behind every ChatGPT query and every cloud process are drinking deep from the national power supply.
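The household equivalence quoted above is easy to verify with back-of-envelope arithmetic.  A minimal sketch, assuming the EIA's average of roughly 10,700 kWh per U.S. household per year (an external figure, not stated in this article):

```python
# Back-of-envelope check of the "300 TWh ~ 28 million households" claim.
# KWH_PER_HOUSEHOLD is an assumed EIA-style average, not from the article.

AI_DEMAND_TWH = 300         # projected annual AI electricity use by 2028
KWH_PER_HOUSEHOLD = 10_700  # approx. average U.S. household use per year

households = AI_DEMAND_TWH * 1e9 / KWH_PER_HOUSEHOLD  # 1 TWh = 1e9 kWh
print(f"{households / 1e6:.1f} million households")   # ~28.0 million
```

Under that assumption, the projection works out to about 28 million households, matching the figure quoted.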

That power comes at a price.  Most of the electricity reaching these centers still comes from fossil fuel plants, so the carbon emissions are enormous.  In fact, Stanford researchers report U.S. data centers emitted some 105 million tons of CO₂ in the past year – about 2.18% of the nation’s total carbon emissions.

To put it another way, the modern “cloud” is heavier in emissions than many entire industries.  (For reference, Berkeley Lab finds data centers account for roughly 0.5% of all U.S. greenhouse gases.)  These figures are not the byproduct of some secret practice but the ordinary consequence of running massive computing loads on a dirty grid.

Meanwhile, water is being overlooked in the frenzy.  Data centers need water to cool their blazing hot processors.  It’s easy to forget that “the cloud” is as physical as a river or an aquifer.  Cooling systems often rely on evaporative chillers or towers that use about two liters of water for every kilowatt-hour of energy consumed. 

In 2023, U.S. data centers collectively used 66 billion liters of water (about 17.4 billion gallons) – triple the volume of 2014.  In plain terms, America’s server farms are now gulping water by the tens of billions of gallons per year.

Think of it locally: in the Pacific Northwest, Google’s massive data center in The Dalles, Oregon, has come to use nearly a quarter of the city’s water supply.  When activists forced Google to reveal its numbers, it turned out the center withdrew 355 million gallons in 2021.  That is roughly 25% of the town’s annual use.  Similarly, Food & Water Watch reports that in 2022 alone, Google, Microsoft and Meta together used about 580 billion gallons of water for data center power and cooling.  That number is almost unimaginable – roughly enough to satisfy 15 million U.S. households for a year.  And it keeps rising: one projection suggests American AI facilities could need 720 billion gallons annually by 2028, enough to meet the indoor water needs of 18.5 million homes.
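The 2022 figure and the 2028 projection above imply nearly the same per-household ratio – a quick internal-consistency check, using only numbers quoted in this article:

```python
# Gallons per household implied by the two water claims in the text.
g2022 = 580e9 / 15e6    # 580 billion gallons across 15 million households
g2028 = 720e9 / 18.5e6  # 720 billion gallons across 18.5 million homes

print(f"{g2022:,.0f} vs {g2028:,.0f} gallons/household/year")
```

Both claims work out to roughly 38,700–38,900 gallons per household per year, so the two sources are using a consistent household baseline.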

All this comes at a moment when water scarcity is already a global crisis.  The U.N. reports that about 2.4 billion people lived in water-stressed countries in 2020.  One in ten people worldwide now face high water stress, meaning supplies are dangerously low compared to demand.  Every extra gallon poured into server cooling is a gallon not flowing to farmers, or households, or ecosystems. 

Data center operators argue they recycle much of their water, but many sites do not return the lion’s share to local sources.  Google alone discharges only 20% of the water it withdraws, losing 80% to evaporation.  In dry regions this can run wells dry.  In Arizona and Oregon, for example, officials have had to scramble to expand storage or fallow farms so that high-tech facilities can quench their thirst.

In short, the digital revolution isn’t virtual when it comes to resources.  Power and water are real, finite commodities, and they have to come from somewhere.  The techno-optimist narrative assumes new technologies magically appear without material costs, but the data tells a different story.  Generative AI models are notorious power hogs: training just one model the size of GPT-3 consumed 1,287 megawatt-hours – enough to power about 120 American homes for a year – and released around 552 metric tons of CO₂.  And that’s only the training phase, not counting the ongoing cost of millions of user queries.  In fact, even simple chatbots can be energy gluttons: each ChatGPT query consumes about five times the electricity of a web search.  The computing world is racing to larger models, but the energy and water footprint only grows.
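The GPT-3 training figures above can be cross-checked the same way.  A sketch assuming the same EIA-style average of ~10,700 kWh per household per year (an assumption, not from the article); the implied grid intensity that falls out is roughly consistent with a fossil-heavy U.S. grid mix, which supports the article's point:

```python
# Sanity check on the GPT-3-scale training figures quoted above.
# KWH_PER_HOUSEHOLD is an assumed EIA-style average, not from the article.

TRAINING_MWH = 1_287         # reported energy to train a GPT-3-scale model
TRAINING_CO2_TONNES = 552    # reported emissions from that training run
KWH_PER_HOUSEHOLD = 10_700   # approx. average U.S. household use per year

homes = TRAINING_MWH * 1_000 / KWH_PER_HOUSEHOLD
intensity = TRAINING_CO2_TONNES * 1_000 / (TRAINING_MWH * 1_000)  # kg CO2/kWh
print(f"~{homes:.0f} homes for a year; ~{intensity:.2f} kg CO2 per kWh")
```

The arithmetic reproduces the "about 120 homes" figure, and the implied ~0.43 kg CO₂ per kWh is what one would expect from a grid still dominated by fossil generation.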

Meanwhile, every kilowatt-hour burned to pump that water or power those chips tends to come from a still-carbon-heavy grid.  The companies promise renewables, but the reality is mixed at best.  Big tech firms may sign green energy contracts, but they still draw on local generators – often natural gas turbines or even coal plants working overtime.

One recent analysis predicts that only about 40% of new data center power will be covered by renewables through 2030.  The rest comes from fossil fuels.  In fact, as we build more AI servers, some grid operators are literally turning to generators and coal to keep the lights on.  In Salt Lake City, for instance, local officials recently cut investments in wind and extended the life of old coal plants just to meet surging data-center demand.

This collision of fast innovation with slow ecological realities has a price tag.  The climate impacts of our tech dreams compound other societal costs.  Scientists estimate that climate change already racks up hundreds of billions of dollars in damage each year.  For example, one study finds that U.S. extreme weather disasters attributable to climate change cost about $143 billion per year.

Another analysis suggests the world will face between $1.7 trillion and $3.1 trillion in annual losses by 2050 if emissions go unchecked.  Those costs are borne in flooded homes, ruined farms and dead children – not in tech stock portfolios.  If AI expansion means burning more carbon and straining water supplies, those real losses will only mount.

It’s easy in tech circles to assume that these problems will “work themselves out” – that innovation can keep pace with need, or that some carbon-neutral miracle is around the corner.  But that blind confidence ignores hard limits.  Physics tells us that data centers and chips require energy, and water for cooling. 

There’s no escaping that in the real world, each new server farm is another drain on land and sky.  Even if semiconductor density doubles or chips get more efficient, our collective appetite for computation may outstrip those gains.  As one planning official put it at a city meeting, when asked whether endless data centers were sustainable, he warned simply: “I think we have reached a limit.”

What is striking is the global asymmetry.  In wealthy cities, leaders hail AI as the next engine of growth.  In many poorer regions, like my country, Zimbabwe, people are already living with the consequences of environmental strain.  Consider drought in rural areas where water is life-and-death: it feels almost surreal to see some of the world’s richest companies staking their future on technologies that can dry up local water.  I have seen communities where children walk several kilometers for a bucket of water.  In these places, water is more precious than gold.  This perspective colors my worry: how long can we afford to prioritize rapid innovation over basic resource security?

Yet the data world remains fixated on endless scaling.  We bill cloud computing as weightless and elastic, but the wires and pumps behind the scenes are very much bound to planet Earth.  The same chips that promise miracles require rare minerals to be mined and mountains to be flattened.  For every transistor we celebrate, someone somewhere is pumping water or digging coal.  A single high-end GPU may have a manufacturing carbon footprint much larger than a simple CPU’s.  GPUs are flying off the shelves – nearly 3.85 million shipped to data centers in 2023 – meaning ever more mining and manufacturing emissions upstream.

The choice, then, is not between technology and nature, but how to balance them.  We can keep pushing for breakthroughs, but the question we overlook is: at what cost?  If we proceed as if resources were infinite, we risk undermining the very systems we depend on.  For example, many clean-tech solutions (desalination, hydrogen, massive batteries) require lots of energy and materials too.  The golden assumption that there’s a never-ending “innovation frontier” waiting to save us must be questioned.

A perspective shaped by connection to nature urges humility.  In some parts of the world, people say the Earth is not inherited from our parents but borrowed from our children.  If that rings true anywhere, it is on the plains and aquifers already feeling stress.  We cannot maintain an insatiable demand for “faster, bigger, smarter” without accounting for who pays the price – and that price is increasingly paid in flood insurance claims, drought-relief funds and depleted reservoirs.

We’re not arguing against progress.  Clearly, AI and computing have brought benefits, from medical breakthroughs to economic opportunities.  But just as we would never decouple food from the cost of farming, we should not decouple digital growth from environmental cost. 

Each fantasy of data nirvana still requires acres of steel, kWh of electricity and gallons of water.  If we acknowledge those needs, we can plan accordingly.  Some steps are already happening: major tech firms are pledging “water positive” goals and using recycled water for cooling, and some cities now require new data centers to use renewable power.  These are good starts.

Yet policy and perspective must catch up with the scale.  The country that leads in AI must also lead in sustainable practices.  That means measuring and reporting water use, investing in carbon-free power, and thinking long-term about growth.  For example, if U.S. data centers are expected to consume up to one-tenth of all electricity in a few years, the grid’s makeup will need a radical shift – or else the emissions will explode.  The most powerful country on the planet should set standards now, lest the cheapest path remain simply “burn more fossil fuel.”

Technology thrives on information and context.  It’s ironic that we are so quick to apply AI to everything, yet turn a blind eye to its own environmental footprint.  The “cloud” won’t solve climate change; it’s part of the climate problem.  Treating data and computation as weightless commodities is a dangerous illusion.  We must bring the same data-driven seriousness to ecological constraints as we do to digital metrics.

In the end, there’s no mystery to solve: the Earth has finite water, finite land, and a finite atmosphere.  The idea that we can keep multiplying digital power forever while ignoring those boundaries is, frankly, a mythology – however appealing.  If history teaches anything, it’s that treating nature as inexhaustible invites crises.  So let’s be realists as well as optimists.  We can push boundaries of AI and technology, but we cannot be blind to the world that sustains it.  Limiting growth does not mean halting innovation.  It means innovating within the laws of nature.

As a global community, our challenge is to align the dazzling possibilities of technology with the hard facts of climate and ecology.  We must ask: what good is a superpowerful computer if there’s no safe world left to use it in? 

In my part of the world, people often say that if you want to know the truth about the sky, you look at the stars; if you want to know the truth about yourself, you listen to the wind.  Perhaps if we listen to the Earth – to dried-up rivers and overheated nights – we’ll recognize the truth in the data.  That truth says our planet has limits.  And acting like we have endless room to run tech at full throttle is a luxury we can no longer afford.


ABOUT THE AUTHOR

Kumbirai Thierry Nhamo is an independent social justice activist, writer, researcher, and social commentator. He is also a poet, a blogger (Zealous Thierry), and is currently studying Fabrication Engineering at a polytechnic in Zimbabwe.


"We make the future different by
making the present different."


Peter Maurin (1877-1949)


Creative Commons License
ISSN 2165-9672
