Resources

AI is Pushing Energy Grids to the Limit. What if Your Water Heater Could Help?

Written by WATTER | Sep 10, 2025 5:05:22 PM

How WATTER turns commercial water heaters into green compute infrastructure—tackling AI's energy crisis while creating new value for real estate operators.

Estimated time to read: 10 minutes

AI is reshaping every industry, and it's also overwhelming our energy grids. Businesses are already feeling the strain of delivering more AI compute, and the surprising bottleneck isn't chip supply; it's power and heat. The global chip shortage of 2020–2023 was a primary constraint on AI, but that has shifted rapidly: the energy required to run AI workloads is now growing at an annual rate of between 26% and 36%, and is the primary bottleneck to delivering more compute.

For commercial facility leaders, this challenge is also an opportunity: what if your building's water heaters could help address it? WATTER is transforming an often overlooked corner of infrastructure into a network of intelligent, energy-efficient compute units.

In this piece, we'll explain what's happening behind the scenes of AI's growth—and how your properties can turn heat into an asset, not a liability.

AI's Real Bottleneck: Not Chips—Energy

For the past few years, headlines around artificial intelligence have focused on chip scarcity. And for good reason—surging demand for GPUs and custom accelerators left everyone from startups to hyperscalers scrambling. But as supply chains begin to recover, a deeper and more enduring constraint is becoming clear: the problem isn’t silicon. It’s energy.

Even as chip supply stabilizes, compute demand is scaling far faster than infrastructure can absorb. Model training and inference workloads aren’t just increasing—they’re multiplying exponentially. Behind every AI query lies a power draw that data centers—and by extension, the grid—must support. The physical realities of electricity consumption, heat dissipation, and cooling capacity are now the true governors of AI’s trajectory.

In fact, some new data centers are now facing multi-year delays just to secure a power hookup. Dominion Energy, for example, has reported up to a seven-year wait for new grid access in high-demand zones like Northern Virginia. At the same time, large language models can require over 1,000 megawatt-hours of electricity to train, roughly an hour's output from a gigawatt-scale power plant. According to a report from the International Energy Agency (IEA), one data center today consumes as much electricity as 100,000 households, but some of those currently under construction will require 20 times more, the equivalent of 2 million households.


IEA (2025), Energy and AI, IEA, Paris. https://www.iea.org/reports/energy-and-ai, Licence: CC BY 4.0

We’re entering an era where power budgets—not chip specs—will define who can scale. Data center construction is being constrained not just by land and capital, but by access to electricity. In jurisdictions with aggressive climate targets, operators face growing pressure to decarbonize compute—or pause growth altogether.

Processing data, mainly for AI, will consume more electricity in the US alone by 2030 than manufacturing steel, cement, chemicals and all other energy-intensive goods combined, according to a report from the International Energy Agency (IEA). (Fiona Harvey, Environment editor, The Guardian, 10 Apr 2025)

This shift has serious implications for industries beyond tech. Any sector that relies on real-time decisioning, automation, or on-site AI will increasingly feel the pressure. As compute moves closer to the edge—meaning processing happens locally rather than in distant data centers—energy becomes the constraint that must be innovated around.

That’s also where the opportunity lies. Just as rooftop solar and EV chargers turned passive buildings into active infrastructure, a new wave of sustainable, distributed compute could do the same for heat-intensive assets. Water heaters—long ignored—are suddenly in play. By reimagining them as energy-efficient compute nodes, commercial properties can position themselves to participate in AI’s growing compute economy, while repurposing the heat from those workloads to serve their original function: heating water.

There are more than 130 million water heaters in the U.S. alone, representing over 82 gigawatts of latent energy load already embedded in the built environment. That’s not just thermal capacity—it’s a platform for distributed AI infrastructure hiding in plain sight.

For real estate operators, this isn’t just a climate solution. It’s a way to transform a cost center into a strategic asset. As AI demand strains the grid, solutions like these provide a near-term infrastructure advantage, and a glimpse of how energy and compute might converge at scale. It's a closed loop: power in, compute out, heat reused. That's not just good stewardship; it's a competitive edge.

The Business Risk Hiding in the Energy Curve

AI is already finding its way into buildings—via smart HVAC systems, voice-controlled assistants, and a growing wave of tenant-facing automation. It's visible, helpful, and in some cases even expected. But while most operators are focused on features and convenience, few have looked behind the curtain to understand what these systems actually cost to run.

Each AI-enabled feature, especially those that remain always-on or process real-time data, adds a marginal but constant pull on the power supply. And that’s just the tip of the iceberg. Many of these tools depend on powerful data centers to process information behind the scenes—creating a ripple effect of energy demand that’s mostly invisible to building operators.

This pressure shows up in different ways depending on the building. In older properties, it might mean degraded power quality that shortens the lifespan of equipment or triggers unplanned maintenance. In newer ones, it might be escalating energy bills that chip away at margins. Either way, the convergence of AI, automation, and volatile energy pricing is shifting electricity from a predictable expense to a hidden risk.

That risk is no longer hypothetical. Grid operators across the U.S. are already feeling the strain. Data centers consume approximately 4% of U.S. electricity today, and that figure could rise to 12–15% by 2030, largely due to AI. In high-density regions, this demand is affecting more than supply—it’s affecting quality. Data center clustering has been linked to voltage distortion and wave instability that can ripple into nearby neighborhoods, disrupting power to businesses and homes.

Meanwhile, expectations are rising. ESG performance is becoming a baseline requirement for capital access and tenant retention. Buildings that over-consume or can’t demonstrate credible emissions reductions risk falling behind. According to recent surveys, up to 77% of tenants consider sustainability features when making leasing decisions. Energy blind spots that once flew under the radar now create reputational risk—and lost deals.

The business impact is clear: AI is making buildings smarter, but also hungrier. The energy curve is steepening, and for many portfolios, it’s bending toward volatility. Without a proactive strategy, operators could find themselves facing rising demands, rising costs, and rising expectations—all at once.

Yet within that curve lies opportunity. WATTER’s insight is to flip the equation. While data centers battle rising costs and cooling loads, WATTER turns electricity into compute, then uses the resulting heat to serve a building’s existing hot water needs. It’s an inversion of the traditional model: rather than paying to eliminate waste heat, WATTER puts it to work.

This approach aligns with two macro trends already reshaping energy infrastructure: distributed compute and thermal reuse. It also addresses a real economic pain point. Hot water heating is one of the largest operational expenses in many commercial buildings—representing up to 36–39% of total energy use in sectors like hospitality and multifamily housing. With WATTER, that cost doesn’t just shrink—it can become a source of monetizable compute capacity, especially for AI workloads like LLM inference and retraining.

The pressure may be rising—but for operators with the right infrastructure, so is the upside.

WATTER's Approach: Making Infrastructure Work Twice

Nearly 98% of the electricity consumed by data centers is released as low-grade heat—typically expelled through massive cooling systems that waste energy and strain local grids. At the same time, buildings account for over 30% of global energy use, with heating and hot water consuming up to half of that. In effect, two energy-hungry systems—compute and heating—are operating side by side, yet entirely disconnected.
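The scale of that overlap can be sanity-checked with a back-of-envelope calculation. The sketch below uses the Green Grid's Energy Reuse Effectiveness (ERE) metric to illustrate how capturing compute heat for water heating shrinks a site's net energy footprint. All input figures besides the 98% heat fraction are illustrative assumptions, not measured WATTER data.

```python
# Back-of-envelope: how heat reuse changes a compute node's net energy picture.
# Overhead and reuse fractions are illustrative assumptions, not WATTER figures.

def energy_reuse_effectiveness(total_kwh, reused_kwh, it_kwh):
    """Green Grid ERE: (total facility energy - reused energy) / IT energy.
    Values below 1.0 mean reused heat more than offsets non-IT overhead."""
    return (total_kwh - reused_kwh) / it_kwh

it_load_kwh = 100.0    # electricity drawn by the compute module
overhead_kwh = 5.0     # assumed ancillary draw (controls, circulation)
heat_fraction = 0.98   # share of compute electricity released as heat (cited above)
reuse_fraction = 0.90  # assumed share of that heat captured by the tank

total = it_load_kwh + overhead_kwh
reused = it_load_kwh * heat_fraction * reuse_fraction

ere = energy_reuse_effectiveness(total, reused, it_load_kwh)
print(f"ERE: {ere:.2f}")  # well below 1.0 under these assumptions
```

The point of the sketch is directional rather than precise: when most of the compute draw resurfaces as useful hot water, the marginal energy cost of the workload collapses toward the small ancillary overhead.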

WATTER bridges that divide. By embedding industrial compute modules inside electric water heaters, the company transforms a passive appliance into an intelligent node that handles AI workloads while heating water. It’s a deceptively simple idea with significant implications. Found in virtually every commercial and residential building, these heaters become edge compute units, processing tasks locally and reusing the resulting heat to meet an existing demand.

“People think GPUs are the problem. They’re not. It’s heat,” says James Hancock, CEO of WATTER.

By relocating compute to the edge, inside infrastructure that already exists, WATTER reduces the need for new, centralized data centers. It’s not about building more; it’s about upgrading what we’ve already built. This shift reframes traditional infrastructure as an asset capable of doing double duty: delivering both digital performance and physical utility.

WATTER’s approach taps into an overlooked reservoir of capacity. The U.S. has more than 130 million water heaters representing up to 82 GW of latent electrical load. If even a fraction of those were compute-enabled, the impact would be transformative. According to internal analysis, 10% market penetration could reduce grid power consumption by roughly 71.5 TWh per year, without adding any new energy demand to the grid.
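That internal estimate is roughly consistent with simple arithmetic on the figures above: 10% of 82 GW of latent load running near-continuously for a year. A minimal cross-check (our arithmetic, not WATTER's model; the utilization factor is an assumption):

```python
# Rough cross-check of the 10%-penetration estimate using the figures above.
# The utilization factor is our assumption, not a WATTER-published input.

latent_load_gw = 82.0    # latent electrical load of U.S. water heaters (cited above)
penetration = 0.10       # share of units that are compute-enabled
hours_per_year = 8760
utilization = 1.0        # assume near-continuous operation

twh_per_year = latent_load_gw * penetration * hours_per_year * utilization / 1000
print(f"~{twh_per_year:.1f} TWh/year")  # ~71.8 TWh, close to the cited ~71.5
```

Even at lower utilization the displaced load remains in the tens of terawatt-hours, which is what makes the fleet-scale framing credible.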

This isn’t just a technical breakthrough—it’s an economic one. By leveraging existing assets, WATTER allows operators to rethink cost centers as revenue opportunities. Each WATTER-enabled unit can run paid AI workloads like LLM retraining or inference. The building owner participates in a compute-sharing model where workloads are distributed across the network, and revenue is shared based on use. While WATTER is still finalizing its offtake agreements, the model is designed to fairly offset utility costs and generate incremental value from idle capacity.

“If your hot water heater could generate revenue, who wouldn’t say yes to that?” — James Hancock

This decentralized approach also opens the door to fleet-wide deployment. While WATTER is currently best suited for pilots, it is actively pursuing partnerships with operators managing multiple buildings—such as hotel groups or multifamily portfolios. In these cases, a single successful pilot could unlock portfolio-scale advantages: shared infrastructure standards, consolidated ESG reporting, and distributed compute capacity spread across dozens or hundreds of sites.

The environmental impact scales too. Imagine a 200-unit multifamily building outfitted with WATTER. It could offset tons of CO₂ per year while gaining access to local compute capabilities. These emissions reductions are traceable and verifiable via lower PUE, and the company is integrating its reporting with leading ESG frameworks.

In a market where data center expansion is constrained by permitting delays, grid limitations, and investor pressure for sustainability, WATTER offers something rare: a way to grow compute capacity without building new infrastructure or adding new energy to the grid. It’s not just clever engineering—it’s a reframing of what infrastructure can be in an AI-powered world.

What It Means for Operators: Revenue, Resilience, Reputation

For building operators facing volatile utility prices, shifting tenant expectations, and rising ESG scrutiny, WATTER offers a rare combination: a system that offsets energy spend, strengthens on-site infrastructure, and supports sustainability goals—all in a single upgrade.

The opportunity starts with fundamentals. Space heating and hot water account for 36–39% of total energy use in most U.S. commercial buildings. That’s a significant operational load—typically viewed as an unavoidable cost. WATTER reframes it as an underutilized asset. By embedding compute into systems that already consume energy, WATTER turns thermal infrastructure into a value-generating layer.

Revenue is the most tangible benefit. Each WATTER-enabled heater can run AI workloads—like large language model (LLM) inference or retraining—and earn income in return. Operators aren’t installing entirely new infrastructure; they’re upgrading what already exists. This upside is especially compelling in high-demand sectors like hospitality, multifamily housing, and campus environments, where hot water usage is constant. These buildings now have a path to convert their thermal overhead into distributed compute capacity—with monetization potential built in.

Resilience follows. Grid instability is no longer a hypothetical—it’s a constraint. Electricity prices are rising sharply: in PJM's 2025/2026 capacity auction, clearing prices surged more than 833% year-over-year. Data centers are facing power shortages and permitting delays. So instead of relying on hyperscale centers that may take years to build, operators can deploy localized nodes that deliver redundancy while serving daily needs.

Reputation is the third pillar, and one of growing urgency. Sustainability has become a filter for investor capital and tenant retention. Upward of 80% of institutional investors now prioritize ESG performance in their decisions, and tenants increasingly demand visible, verifiable environmental commitments. WATTER supports both: by reusing heat that would otherwise be wasted, it reduces net energy use and emissions. Its impact can be measured through lower PUE, and the company is open to aligning with ESG reporting frameworks like LEED, GRESB, and CDP.

“Making ESG actionable means rethinking what we overlook—like hot water heaters,” notes WATTER CEO James Hancock.

WATTER doesn’t require operators to reinvent their buildings. It helps them extract new value from what’s already installed. The result is smarter infrastructure—intelligent, sustainable, and multipurpose. One system that pays for itself, meets compliance goals, and helps future-proof operations in an energy-constrained AI era.

📦 Business Benefits at a Glance

– Offset utility costs with monetized compute

– Reduce net energy use through heat reuse

– Strengthen resilience via decentralized AI workloads

– Support ESG and carbon reporting with auditable metrics

– Differentiate your property with visible, embedded climate tech


The Infrastructure Advantage: Why Now?

For commercial operators navigating a grid-constrained, AI-driven future, infrastructure isn’t just a cost center—it’s a competitive lever. As power prices climb and permitting delays stall new capacity, the ability to embed intelligence into existing systems is emerging as a key differentiator. Using water heaters as compute infrastructure offers a way to act now, before infrastructure bottlenecks close the window.

The opportunity lies in what’s already installed. More than 130 million electric water heaters exist across the U.S., representing over 80 gigawatts of latent electrical load that could perform a second function: running AI tasks that benefit from distributed deployment. The result is not speculative infrastructure, but practical repurposing, with dual benefits.

The value is clear at the portfolio level. A single building can offset emissions and monetize its compute load—but for operators managing dozens or hundreds of properties, the gains compound. Distributed across a portfolio, WATTER’s nodes offer:

  • Aggregated revenue potential from shared AI workloads
  • Carbon reductions that can be tracked across sites
  • Resilience through localized infrastructure
  • A differentiated ESG story backed by visible technology

“WATTER helps turn overlooked systems into digital assets,” says CEO James Hancock. “The best businesses solve problems people ignore—like turning water heaters into compute infrastructure.”

WATTER’s model fits into existing operations without deep structural change. Operators don’t need to build new facilities or retrain teams. The pilot-to-scale pathway begins with a typical equipment upgrade—something most properties already plan for—then expands across the fleet as value becomes clear.

This timing matters. AI’s energy appetite is rising fast, while utilities struggle to keep up. For real estate leaders, waiting means rising costs, capacity constraints, and missed opportunity. Those who move now can lock in infrastructure advantages that others will struggle to match later—especially as regulatory and investor scrutiny tightens around emissions and energy use.

And unlike many sustainability solutions, WATTER’s impact is tangible and traceable. Each deployment delivers a measurable efficiency gain and a story of digital transformation with environmental benefit.

The infrastructure is already here. What WATTER offers is a smarter way to use it—before the next wave of demand makes reactive upgrades the only option.

From startups to facility operators, WATTER is turning everyday infrastructure into a new kind of cloud. Visit our website to see how you can be part of it.