While artificial intelligence’s energy consumption is enormous, its use of water has largely flown under the radar: the water-intensive cooling systems that data centers depend on create a freshwater footprint of comparable size.
Similar to the backlash crypto mining received for its heavy energy usage, concern about the ecological ramifications of AI data centers is now drawing its share of negative press.
With fears of a water management crisis mounting, we’ll examine why AI has developed such a thirst for water, the potential environmental repercussions, and whether AI’s own algorithms can help make the technology more sustainable.
Because water might be everywhere, but soon there may not be a drop to spare.
Key Takeaways
- AI’s water footprint rivals its thirst for energy, as data centers require 24/7 water-based cooling to keep pace with exponentially growing demand.
- A single 100-word response from ChatGPT uses approximately half a liter of water.
- AI’s water usage could hit 6.6 billion m³ by 2027 — about half as much water as the UK uses in a year.
- Relocating AI data centers to cooler, wetter regions could alleviate strain on local water supplies.
- Other real-world solutions include R&D into AI-optimized water cooling systems and large-scale immersion cooling.
What is AI Water Usage, and Why Are Data Centers So Thirsty?
In short, without water-based cooling, the data centers that support artificial intelligence would simply overheat under the ever-increasing demand and complexity of AI workloads.
When you consider that OpenAI’s ChatGPT requires around half a liter (50 cl) of water to generate a single 100-word answer – equal to roughly five short sentences – you can see where the problem lies.
Given that ChatGPT already has over 180 million users, the resulting demand for freshwater consumption is immense.
AI’s water usage is projected to hit 6.6 billion m³ by 2027 — about half as much water as the UK uses in a year.
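To put those figures side by side, here’s a rough back-of-envelope sketch in Python. The half-liter-per-response figure and the user count come from the estimates above; the queries-per-user rate is purely an illustrative assumption.

```python
# Back-of-envelope estimate of ChatGPT's cooling-water draw.
# All inputs are rough public estimates or assumptions, not measured data.

WATER_PER_RESPONSE_L = 0.5        # ~half a liter per 100-word answer
USERS = 180_000_000               # reported ChatGPT user base
QUERIES_PER_USER_PER_DAY = 1      # illustrative assumption

daily_liters = WATER_PER_RESPONSE_L * USERS * QUERIES_PER_USER_PER_DAY
annual_m3 = daily_liters * 365 / 1000   # 1 m³ = 1,000 L

print(f"Daily:  {daily_liters / 1e6:.0f} million liters")
print(f"Annual: {annual_m3 / 1e6:.1f} million m³")
```

Even at just one query per user per day, that works out to roughly 33 million m³ a year – a small slice of the projected 6.6 billion m³, which also covers model training, other AI systems, and the indirect water used in power generation.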
Drawing extensively from regional water supplies, data center cooling systems are essential for keeping AI servers at an ideal operating temperature of around 70°F, 24 hours a day. This round-the-clock cooling is often done through water-based air conditioning or evaporative cooling towers – not to mention the secondary water draw when the electricity itself is generated by coal, gas, or nuclear power plants.
Needless to say, both processes tax finite local water resources such as reservoirs, rivers, and subterranean water tables, which in some regions are already dwindling due to climate change.
Environmental Impact of AI Water Consumption
Of AI’s double threat of excessive energy and water consumption, it is perhaps the latter’s direct impact on local economies and ecosystems that will be the most visible consequence.
Though climate change’s full impact is still not wholly understood, AI’s immense freshwater footprint strains the resources of local communities in areas already suffering drought or water scarcity due to altered weather patterns.
In states such as Texas and California – AI data center hotspots – locals are already feeling the pinch from lower-than-expected rainfall, and the growing number of tech hubs will, in all likelihood, only deplete their water resources further.
Here, residents and water-dependent industries such as agriculture may, at best, soon see their water rates rising – and in worst-case scenarios, possibly see their water supplies drying up entirely due to over-demand.
Solutions and Innovations to Reduce AI’s Water Footprint
For all the doom and gloom associated with AI’s ever-increasing demand for our natural resources and energy supplies, perhaps AI itself could prove to be its own savior.
Putting its complex algorithms to good use, machine learning (ML) could help with aspects such as water demand forecasting, optimizing water recycling systems, and even leak detection, all of which could help reverse AI’s one-way relationship with water consumption.
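As a minimal sketch of the demand-forecasting idea, the snippet below fits a simple least-squares trend to made-up daily cooling-water usage and projects it forward. Real systems would use far richer models and live telemetry; every number here is fabricated for illustration.

```python
# Minimal water-demand forecasting sketch: fit a linear trend to
# (synthetic) daily cooling-water usage and extrapolate it.
# The data and growth rate are made up for illustration only.

def fit_linear_trend(series):
    """Ordinary least-squares fit of y = a + b*t over t = 0..n-1."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    slope = num / den
    intercept = y_mean - slope * t_mean
    return intercept, slope

# Synthetic usage: ~1,000 m³/day baseline, climbing 5 m³ per day.
usage = [1000 + 5 * day for day in range(30)]

a, b = fit_linear_trend(usage)
forecast_day_60 = a + b * 60
print(f"Fitted growth: {b:.1f} m³/day")
print(f"Projected usage on day 60: {forecast_day_60:.0f} m³")
```

The same trend-versus-actual comparison underpins simple leak detection: a sustained gap between forecast and metered usage flags water going somewhere it shouldn’t.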
AI could also assist in the research and development of more advanced, highly efficient cooling techniques, such as the large-scale immersion cooling being pioneered by Iceotope and Green Revolution Cooling (GRC).
However, perhaps the burden should ultimately fall on the tech giants themselves.
This could involve relocating AI data centers to locations with near-inexhaustible water tables, or basing operations in cooler climates, as is being pioneered in Iceland, Finland, and Norway.
One thing is for sure: the demand for AI is growing, and its water cost is invisible to end users — leave a tap running at home and you’ll notice it on your water bill, but hammering ChatGPT and other models with queries carries no such reminder.
The Bottom Line
Whether it’s the lack of awareness of AI’s extensive freshwater footprint or the belief that the consequence of its insatiable demand only affects those living near these massive AI data centers — something has to change if we’re to lessen AI’s unquenchable thirst.
The uncertainty of global warming certainly isn’t helping: once-reliable cyclical weather patterns are being disrupted, upsetting the traditional water cycle and causing droughts across the planet.
While efforts are being made to optimize data center cooling processes, perhaps this is where AI itself needs to step in.
Implementing technology’s own algorithms to work towards a solution to a problem of its own making is a step in the right direction. Still, for those already grappling with the moral consequences of AI’s thirst, a solution cannot come fast enough.