GATO · AI Research Brief
The concern is real. The numbers are big. And the question isn't whether to be worried; it's what to actually do about it. This is that conversation.
Part One · The Concern
Data centers consume enormous volumes of freshwater. The buildout is accelerating. Many of the largest facilities are being planted in regions that are already water-stressed. These numbers don't come from activists. They come from universities, government agencies, and the companies themselves.
The people in rural Georgia whose wells were affected by a nearby facility weren't operating from incomplete information. They were operating from direct, personal experience of harm. That experience is valid. It deserves to be taken seriously, not managed, not redirected, not explained away.
So before we go anywhere else, let's look the numbers in the eye.
"OpenAI's Stargate project is a 100-billion-dollar investment in a single campus in Abilene, Texas, a region where a hydrologist has described every part of the state as facing a water-energy nexus crisis."
— From the peer-reviewed analysis underpinning this page

The transparency problem is also real. Many data center operators arrive in communities under non-disclosure agreements, concealing projected water and energy consumption from the very residents whose infrastructure will be affected. Communities are being asked to accept trade-offs they have not been given the data to evaluate. That is a governance failure.
Acknowledging all of this isn't pessimism. It's the only honest starting point for what comes next.
Part Two · Context
Comparative harm is not a defense, and we're not offering it as one. But context matters for strategy. If we're going to direct energy effectively, we need to understand the full picture of where water actually goes.
The fact that agriculture uses more water than data centers does not make data center water consumption acceptable in a drought. And data center growth is accelerating: if nothing changes, the curve tilts steeply upward. That is precisely why the solutions below matter.
What context does give us is leverage. Industries that use far more water than data centers are already subject to regulation, public pressure, and innovation mandates. The frameworks exist. The question is whether we're willing to apply them here with the same urgency, and whether we can move fast enough.
Part Three · The Real Debate
We read two rigorous pieces: one optimistic about technology-led solutions, one written as a direct, skeptical rebuttal. What's remarkable is where they converge. Here's the real tension:
5,400 data centers in the US still use legacy evaporative cooling. Retrofitting them is expensive and disruptive. The tech is advancing at the frontier. In any meaningful timeline, it is not replacing what currently exists. And fiscal dependency on data center tax revenue compromises the very regulators we need to enforce change.
The entities with the most to gain from solving water efficiency have the most computational resources to apply to it. Microsoft is already deploying AI systems to predictively manage cooling loads. The technology (liquid cooling, MOFs, seawater systems) exists. The barriers are economic incentives and regulation, not physics.
"I came to dismantle the optimistic case. The structural criticisms hold. But I cannot honestly tell you the technology is going nowhere, and I cannot tell you the policy proposals are impossible. Only that they are hard. Those are different arguments."
— From the skeptic's rebuttal, after examining the evidence

Where they agree is the most important part: communities bearing the costs of this buildout deserve full information, genuine political leverage in permitting negotiations, mandatory disclosure, and enforceable community benefit agreements. The path forward runs through better policy and smarter community negotiation. The dispute is about the terms of how computation gets built, not computation itself.
Part Four · What's Actually Being Built
The narrative that the industry is ignoring the water problem is approximately two years out of date. The actual frontier of data center engineering has moved aggressively toward alternatives, and several of them are genuinely impressive.
Servers submerged in non-conductive dielectric fluid absorb heat at the source. No evaporation required. This also cuts energy use by 50%. Microsoft has already deployed closed-loop liquid cooling systems at scale.
Metal-Organic Frameworks are engineered crystalline materials that capture water molecules from air, even in arid conditions. The Nobel Prize in Chemistry was awarded in 2025 for the foundational research. Companies like AirJoule and Atoco are now commercializing containerized units that use the data center's own waste heat to pull pure water from the atmosphere.
Commercially Available

Google's facility in Hamina, Finland pumps seawater through titanium heat exchangers, returns it at normalized temperature, and uses zero potable water. Researchers have gone further: cogeneration systems where seawater evaporation simultaneously cools the facility and produces desalinated freshwater, a net-positive water contribution to the surrounding region.
Deployed at Scale

The thermal exhaust from cooling systems is a resource currently being discarded. District heating programs in Denmark and Finland are already capturing this heat. atNorth's campus outside Copenhagen pipes data center exhaust through a local partnership, heating over 8,000 residences. A data center that warms the neighborhood it occupies is a fundamentally different relationship.
Operational in Europe

Microsoft's AI systems are already being deployed to predictively manage cooling loads, modeling thermal demand before it occurs and reducing consumption accordingly. The intelligence inside these facilities is being turned on the problem the facilities themselves create. This is not a future promise. It is a current deployment.
In Active Deployment

Instead of cooling air and letting servers warm it back up, liquid coolant circulates directly across the processor. No evaporative loss. No open-air exposure. The closed loop means the same water cycles indefinitely with minimal makeup water required. This is becoming standard in hyperscale new builds.
Adopted by Major Operators

American municipalities lose 15–40% of their water supply through leaky pipes, a problem no current program is solving at scale. Real-time pipe pressure modeling, AI-driven leak detection, and demand forecasting could recover billions of gallons annually. The computational infrastructure to run this already exists inside every large data center. This is proposable today as a permitting condition.
Technically Ready

Part Five · The Bigger Picture
The claim that the technology creating the problem is also a tool for responding to it is not a rhetorical move. It is a structural observation. The same AI infrastructure generating heat and consuming water is, right now, being used to model climate systems, optimize agricultural water use, design novel materials, and accelerate environmental research. Here is what that looks like concretely:
"The technology that is creating the problem is also the most powerful tool available for responding to it. Using it well, at the individual and community level, is not capitulation. It is the most rational thing you can do."
— The Water Question

Part Six · The Policy Framework
These are not utopian ideas. They are structurally achievable policy frameworks that several jurisdictions are already beginning to implement. Each one has a working precedent somewhere in the world.
Every data center above 10 megawatts in a municipality contributes at least 1% of gross revenue to a Community Water Infrastructure Fund, administered by an independent local board. A 500-megawatt campus generating hundreds of millions in revenue would contribute transformative capital. This is cost internalization. At 1%, the $100 billion Stargate project alone works out to $1 billion.
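The arithmetic behind the fund is simple enough to check directly. A minimal sketch, assuming illustrative revenue figures (the $300M/year campus revenue below is a hypothetical, not reported data):

```python
def fund_contribution(gross_revenue_usd: float, rate: float = 0.01) -> float:
    """Annual contribution at the proposed minimum rate of 1% of gross revenue."""
    return gross_revenue_usd * rate

# A hypothetical 500 MW campus generating $300M/year in gross revenue:
print(f"${fund_contribution(300e6):,.0f}")  # $3,000,000 per year

# The $100 billion Stargate figure cited above, taken at 1%:
print(f"${fund_contribution(100e9):,.0f}")  # $1,000,000,000
```

The point of writing it out is that the rate is the only policy lever: everything else scales mechanically with facility revenue.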
No permit without full public disclosure of projected water consumption, source, return rate, and drought contingency plans. NDAs between operators and local governments must be prohibited in any permitting process involving shared public infrastructure. California's Assembly Bill 93 is an early version of this. It should be the national floor, not the exception.
Any data center sited in a high or extremely high water stress region (per WRI classification) must integrate on-site atmospheric water harvesting sufficient to offset a minimum percentage of cooling water consumption. MOF-based systems from companies like AirJoule are commercially deployable today. Making this mandatory in stress zones accelerates manufacturing scale, driving down costs for every deployment worldwide.
Waste heat from cooling is a resource currently being discarded. Incentivize, and in dense urban areas require, partnerships with district heating programs. atNorth's campus outside Copenhagen already heats 8,000 homes with data center exhaust. A data center that heats the neighborhood it occupies has a fundamentally different relationship with that neighborhood.
Large-scale operators seeking permits in water-stressed municipalities contribute computational resources toward optimizing the municipality's water system: real-time pipe pressure monitoring, AI-driven leak detection, demand forecasting, drought scenario planning. Most mid-sized American cities cannot afford to build this independently. Every large data center can run it as a trivial side task. This converts the facility from a water consumer into a water management partner.
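To make the "trivial side task" claim concrete, here is a minimal sketch of the core idea behind pipe-pressure leak detection: flag readings that drop sharply from a sensor's recent baseline. The data, window size, and threshold are illustrative assumptions; a production system would fuse many sensors with flow data and demand forecasts.

```python
from statistics import mean, stdev

def flag_leaks(pressures, window=6, z_threshold=3.0):
    """Flag readings whose drop from the rolling baseline exceeds z_threshold.

    pressures: time-ordered readings (psi) from one pipe-pressure sensor.
    Returns indices of suspect readings.
    """
    alerts = []
    for i in range(window, len(pressures)):
        baseline = pressures[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # A sudden sustained pressure drop relative to recent noise
        # is the classic signature of a leak or main break.
        if sigma > 0 and (mu - pressures[i]) / sigma > z_threshold:
            alerts.append(i)
    return alerts

# Steady ~60 psi, then a sharp drop at index 8 (simulated main break):
readings = [60.1, 59.9, 60.0, 60.2, 59.8, 60.0, 60.1, 59.9, 52.0, 51.5]
print(flag_leaks(readings))  # → [8]
```

Running this continuously across a city's sensor network is a small workload by data center standards, which is exactly the asymmetry the permitting condition exploits.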
Part Seven · Your Leverage Points
The framing that positions individuals against data centers directs energy at the wrong fulcrum. The concern about data centers is legitimate. Community leverage inside the permitting process is where that concern can actually land. The data centers are not going away. The computation is not stopping. What you can influence is the terms: the community benefit agreements, the permitting conditions, the disclosure requirements, the state legislation, the local zoning decisions.
And there's a more immediate version of the same logic: AI tools available right now can help you research, draft, organize, and reach people at a scale that wasn't previously possible for community advocates.
"The communities adjacent to these facilities are not powerless. They have permitting authority, zoning authority, and the ability to negotiate community benefit agreements as a condition of local approval. Several communities are already doing this. The model exists. It needs to spread."
— The Water Question

Even accepting the optimistic technology trajectory, three things are not negotiable: mandatory public disclosure of water consumption as a permitting condition; prohibition of non-disclosure agreements in any permitting process involving shared public infrastructure; and enforceable community benefit agreements for all facilities above 10 megawatts in water-stressed regions. These are not technology questions. They are governance questions. They can be implemented now.
Better policy. Smarter community negotiation. And putting the intelligence inside these machines to work on the problems they are generating. That is the highest-leverage thing you can do.