Transparency on the Water Demand of AI Infrastructure: What Reporting Requirements Should Policymakers Be Focusing On?
by Sara Kassir, LCSSP Research Policy Lead

April 14, 2026


A growing point of contention about data centers is that communities often have little visibility into how the local siting of these facilities can affect public resources. In response, over a dozen state legislatures are actively considering laws that would mandate reporting on the facilities' power and water use. But a closer look at these proposals reveals a telling lack of consensus about what information is actually worth collecting — and recent Caltech research suggests that many of the bills may be oriented around the wrong metrics entirely.


Municipalities throughout the United States are at something of a crossroads when it comes to data centers. The public clearly wants more information about how these facilities affect local environments and community resources. But officials often don't know what questions to ask. In hundreds of cities, towns, and counties, policymakers are being tasked with making major decisions about proposed projects while simultaneously trying to develop an evaluative framework in real time.

From Invisible Demand to Public Scrutiny

Water usage illustrates this problem particularly well. Data centers have long required large volumes of water for cooling, but until quite recently, the specifics of this demand were rarely scrutinized. That began to change around 2021 and 2022, when record-breaking drought conditions across the Western United States drew public attention to the water consumption of large industrial users. Around the same time, researchers at Virginia Tech and Lawrence Berkeley National Laboratory published the first estimate of U.S. data center water consumption, finding that at least 20% of facilities draw from moderately to highly stressed watersheds.

As data centers continue to proliferate, it's unsurprising that communities evaluating proposed facilities are demanding assurances that new development won't strain local utilities. But while the underlying concern is easy to understand, determining whether it's warranted in any given case is not so straightforward. Water accounting (the systematic process of analyzing water stocks, flows, demand, and consumption) has long been recognized as a difficult technical challenge among resource management experts, made even harder by the extreme decentralization and fragmentation of utility systems in the United States. With research on modeling the sustainability of AI infrastructure still in its early stages, water accounting in the context of data centers is an even less settled question.

A Patchwork of Legislative Proposals

The general uncertainty about how to gauge AI's water demand is well illustrated in a wave of legislative proposals that have emerged in the last year. As shown in the attached analysis, over a dozen states have introduced bills aimed at increasing transparency around the community impacts of data centers, with reporting requirements on water usage being a popular feature. Despite the similarities in how these proposals are rhetorically presented, they showcase a telling lack of consensus about what information is actually worth reporting. For example, while some states are asking only for annual reports on total volume consumed, others want details about where the water comes from, whether it is reclaimed or potable, and how it is used within the facility.

The variation across state proposals is not necessarily a problem in its own right. The deeper issue is that many of the bills leave unresolved whether the information being reported would actually help a community judge if a proposed facility is compatible with its local water constraints. Absent a clear connection between what is reported and what communities ultimately want to know, greater transparency does not necessarily translate into improved decision-making capacity.

The Importance of Peak Demand

Recent work co-authored by Caltech's Adam Wierman and UC Riverside's Shaolei Ren helps clarify what's missing. In "Small Bottle, Big Pipe: Quantifying and Addressing the Impact of Data Centers on Public Water Systems," the authors draw a distinction that most current reporting proposals overlook: data centers don't use water the way other customers do. Their demand is highly "spiky," meaning it's relatively modest for most of the year, but surges on the hottest days. The authors find that a data center's peak daily water demand can be 3 to 8 times higher than its annual average, and in some cases the ratio exceeds 10.
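To make the "spiky" pattern concrete, the sketch below simulates a year of daily water use for a hypothetical facility (a steady cooling baseline, a seasonal swing, and a short heat-wave surge) and computes the peak-to-average ratio. All numbers here are illustrative assumptions, not data from the paper; the point is only that an annual average can badly understate what the hottest days require.

```python
import math

# Hypothetical daily water use (gallons/day) for a data center over one year:
# a routine cooling baseline, a seasonal swing peaking in midsummer, and a
# sharp surge during a short heat wave. All values are assumed for illustration.
baseline = 100_000  # gal/day of routine cooling demand (assumed)

daily_use = []
for day in range(365):
    seasonal = 1 + 0.5 * math.sin(2 * math.pi * (day - 110) / 365)  # summer high
    heat_wave = 5.0 if 195 <= day <= 205 else 1.0  # hottest stretch of the year
    daily_use.append(baseline * seasonal * heat_wave)

average = sum(daily_use) / len(daily_use)  # the figure annual reports capture
peak = max(daily_use)                      # the figure infrastructure must meet
print(f"peak-to-average ratio: {peak / average:.1f}")
```

With these assumed inputs the ratio lands in the 3-to-8 range the paper reports for typical facilities; stretching the heat-wave multiplier pushes it past 10, matching the extreme cases.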

The paper's title captures the core problem. The "Small Bottle" — a data center's total annual water consumption — can appear quite manageable, sometimes representing less than 1% of a local system's yearly throughput. But the "Big Pipe" — the peak capacity that the system must be able to deliver on the hottest days — is what actually determines whether existing infrastructure can handle a new facility. Public water systems are engineered around maximum demand, not annual averages. A facility that looks modest in yearly totals may nonetheless require costly infrastructure upgrades to accommodate its summer peaks, precisely when the system is already under the most stress.
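The same arithmetic can be run in the other direction: compare a facility's share of annual throughput with its share of the system's peak-day headroom. The numbers below are back-of-envelope assumptions (not the paper's data), chosen only to show how a "small bottle" in yearly terms can still demand a "big pipe."

```python
# Back-of-envelope illustration with assumed numbers (not the paper's data).
system_annual = 10_000_000_000     # gal/yr delivered by a mid-size system (assumed)
system_peak_capacity = 45_000_000  # gal/day the system can deliver at maximum (assumed)
existing_peak_demand = 42_000_000  # gal/day already drawn on the hottest day (assumed)

dc_annual = 80_000_000  # gal/yr: the facility's "small bottle" (assumed)
dc_peak = 1_500_000     # gal/day on the hottest days: the "big pipe" (assumed)

annual_share = dc_annual / system_annual  # share of yearly throughput
headroom = system_peak_capacity - existing_peak_demand  # spare peak-day capacity
headroom_share = dc_peak / headroom  # share of that headroom the facility claims

print(f"annual share of system throughput: {annual_share:.1%}")
print(f"share of peak-day headroom: {headroom_share:.0%}")
```

Under these assumptions the facility consumes less than 1% of the system's annual throughput, yet claims half of the spare capacity available on the hottest day — which is why a report listing only annual totals can miss the question that actually decides whether upgrades are needed.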

Mapping the Gaps in Current Transparency Proposals

The current map of state proposals, illustrated below, makes this gap concrete. Of the bills introduced in the first several months of 2026, over half fail to mention peak or maximum water demand in the scope of their reporting requirements.

This is not to say that the bills omitting peak demand are without value. States may have legitimate reasons to seek other forms of disclosure, including better understanding aggregate resource use, informing utility tariff design, or reassessing the structure of existing tax incentives. But those are different goals from helping a community determine whether a proposed facility is actually compatible with local infrastructure constraints. In the context of American federalism, decisions about siting still fall largely to municipalities and other local entities with authority over land use, zoning, permitting, and utilities. If the central justification for transparency is that communities need better tools to evaluate local impacts, then reporting regimes should be designed around the information those decision-makers actually need — including peak demand data.

That distinction matters because poorly designed transparency can create the appearance of accountability without meaningfully improving public decision-making. Reporting requirements are not costless. They take time and political capital to enact, require agencies to administer, and depend on institutions having the capacity to interpret and use the resulting data. The current wave of legislative interest therefore presents a rare opportunity. Public concern is high, lawmakers are paying attention, and there is genuine momentum behind the idea that communities deserve more information about the infrastructure being built around them. The challenge is to make sure that this investment produces an information ecosystem that is genuinely decision-relevant at the local level.

Translating Science to Policy in Real Time

One essential part of meeting this challenge is ensuring that emerging scientific insights are incorporated into policy design while these regimes are still being built. The lesson of the recent research is not simply that policymakers should ask for more data, but that they should ask for the right data. That will not happen automatically. If scientists want public institutions to move past the "Small Bottle" of annual averages, they will need to communicate those insights proactively — before transparency requirements harden around metrics that are easier to report than they are to use.

Bridging this gap is central to LCSSP's mission. We are currently building on these research findings by partnering with Institute experts to ensure that technical insights regarding the sustainability of data centers are translated into actionable guidance for policymakers in a timely and effective manner. In the near term, we look forward to publishing a suite of tools—including model legislation and auditing benchmarks—designed to empower local communities with the scientific evidence they need to navigate the rapidly evolving trajectory of AI infrastructure.
