AI Needs Power. Scaling It Without Raising Ratepayer Costs Requires a New Power Strategy.




Joel Yu

Senior Vice President, Policy & External Affairs

As AI data centers scale rapidly, power strategy is becoming a core infrastructure decision, enabling faster deployment while protecting ratepayers and strengthening the energy system.

Recent policy developments in Washington have underscored a growing reality: the race to build AI infrastructure is directly tied to the challenges associated with powering it. Federal regulators are working to establish new rules for integrating massive electricity loads with co-located generation, while the White House has called on technology companies to ensure the electricity required for AI infrastructure does not raise costs for American households.

Across Washington and state capitals, policymakers are increasingly focused on protecting ratepayers without halting progress. What was once viewed as a technical infrastructure challenge has become a matter of national and economic security. AI is the defining technology race of this decade, and its biggest constraint isn’t land, capital, or talent — it’s electricity, and how quickly developers can secure it.

Grid interconnection timelines are stretching well beyond construction schedules across much of the country. Utilities must study the impact of new large loads and plan upgrades to substations, transmission lines, and generation resources before new facilities can be energized — creating a growing bottleneck for hyperscale developers racing to deploy AI capacity to support growing training and inference workloads.

That reality is driving an intensifying debate about how AI infrastructure should be powered, with some proposing fully self-powered data centers disconnected from the grid and others focusing on expanding grid capacity. In practice, the most effective approach lies between these extremes: data centers bring their own flexible onsite generation while remaining connected to the grid. This model has the potential to accelerate the deployment of AI infrastructure, support grid reliability and affordability, and ensure that the rapid growth of computing capacity strengthens — rather than strains — the broader energy system.

Affordability Meets System Efficiency

Community concerns about the cost of powering large-scale AI infrastructure are at the center of today’s policy debate for good reason. Hyperscale data centers often require hundreds of megawatts of electricity, and meeting that demand requires significant investment in generation, transmission, and distribution infrastructure. Without the right structures in place, those costs can be passed on to the broader customer base, prompting responses like the White House’s Ratepayer Protection Pledge and similar regulatory efforts.

But data centers can also be part of the solution. When paired with co-located dispatchable generation, new data center development can bring additional capacity onto the grid while reducing the need for utility-funded infrastructure. Because these assets are built to serve the data center and support grid operations, private investment helps reduce or defer the amount of new generation and grid infrastructure utilities would otherwise need to build, effectively sharing the burden of system expansion.

This approach also improves system efficiency. Generation located close to the load reduces reliance on long-distance transmission and the losses that come with it. Rather than simply adding demand to the grid, co-located resources can be structured to contribute capacity to the system where it is needed.

Importantly, EPRI and others have noted that connecting large, high-load-factor customers can improve utilization of existing grid infrastructure and spread fixed system costs over more kilowatt-hours, which helps lower per-customer costs. In that context, the growth of AI infrastructure, paired with thoughtful power strategies, has the potential not only to avoid raising bills, but to improve energy affordability more broadly for surrounding communities.
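The cost-spreading effect described above is simple arithmetic, and a short sketch makes it concrete. All figures here are hypothetical round numbers chosen for illustration, not drawn from any utility filing or EPRI analysis:

```python
# Illustrative sketch of how a high-load-factor customer spreads fixed
# grid costs over more kilowatt-hours. All numbers are hypothetical.

def per_kwh_fixed_cost(fixed_cost_usd: float, annual_kwh: float) -> float:
    """Fixed system cost allocated per kilowatt-hour of sales."""
    return fixed_cost_usd / annual_kwh

# Hypothetical system: $1B/year of fixed costs, 20 TWh of existing sales.
fixed = 1_000_000_000
existing_sales = 20_000_000_000  # kWh

before = per_kwh_fixed_cost(fixed, existing_sales)

# A 300 MW data center at a 90% load factor adds
# 300,000 kW * 8,760 h * 0.9 ≈ 2.37 TWh of new annual sales.
new_load_kwh = 300_000 * 8760 * 0.9

after = per_kwh_fixed_cost(fixed, existing_sales + new_load_kwh)

print(f"fixed cost per kWh before: ${before:.4f}")
print(f"fixed cost per kWh after:  ${after:.4f}")
```

Because the new load pays its share of fixed costs while consuming at a steady rate, the per-kWh fixed-cost allocation falls for every customer on the system; this is the mechanism behind the affordability claim, provided the new load's incremental infrastructure costs are borne by the data center itself.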

A Smarter Power Architecture for AI Infrastructure

While co-locating flexible generation with large loads improves system efficiency and cost allocation, the broader value of a grid-integrated approach is that it gives data centers access to a large-scale, diversified power supply and the ability to coordinate with the grid to maintain reliability. The grid remains the primary source of electricity for the facility, providing scale and efficiency, while onsite generation operates as a flexible resource that can reduce the facility's grid demand during periods of system stress. This allows large loads to operate as active participants in the power system. As the grid continues to add lower-emission generation sources, this approach also enables data centers to draw from an increasingly clean power supply over time.
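The dispatch behavior described above can be sketched in a few lines. This is a minimal illustration of the operating concept, not an actual control system; the load, onsite capacity, and stress signal are all assumed values:

```python
# Minimal sketch of grid-integrated dispatch: the grid serves the load
# by default, and onsite generation picks up part of the load during
# grid stress so the facility's net grid draw shrinks. Hypothetical.

def dispatch(load_mw: float, onsite_capacity_mw: float, grid_stress: bool):
    """Return (grid_draw_mw, onsite_output_mw) for one interval."""
    if grid_stress:
        # Shed as much load onto onsite units as their capacity allows.
        onsite = min(load_mw, onsite_capacity_mw)
    else:
        # Normal operations: the grid remains the primary supply.
        onsite = 0.0
    return load_mw - onsite, onsite

# Normal conditions: the full 300 MW load is served from the grid.
print(dispatch(300.0, 120.0, grid_stress=False))  # (300.0, 0.0)

# During a grid emergency, 120 MW shifts to onsite generation.
print(dispatch(300.0, 120.0, grid_stress=True))   # (180.0, 120.0)
```

From the grid operator's perspective, the 120 MW that comes off the system during stress is equivalent to 120 MW of dispatchable capacity, which is why co-located resources can be counted toward system reliability needs.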

Onsite generation powered by natural gas provides dispatchable power that can run continuously and draw on an extensive underground pipeline network; however, continued investment in gas infrastructure and coordination with the electric system will be important as demand grows.

Maintaining a grid connection is also essential to utilities’ long-term system planning and data centers’ sustainability goals. As utilities expand generation and transmission capacity and bring more renewable resources online, grid-connected data centers can integrate seamlessly into that evolving system, rather than operating as isolated assets. This approach ensures that new infrastructure investments support both current and future demand in a coordinated way and enables more efficient integration of additional renewable generation.

In this framework, onsite generation is not a replacement for the grid — it is a complementary resource. By combining the system-wide capacity and operational efficiency of the grid with the responsiveness of onsite power, developers can build energy systems that are more flexible, more resilient, cleaner, and better aligned with the needs of a rapidly growing AI economy.

Solving the Speed-to-Power Challenge

Interconnection timelines are now one of the primary constraints on deploying new AI infrastructure. For the data center sector, which is defined by rapid innovation cycles, waiting several years for full grid connection can slow the pace at which new AI capacity comes online, or worse, push future AI investment toward regions where power can be secured more quickly.

Flexible onsite generation offers a practical way to bridge that gap. Rather than waiting for full interconnection before energizing a facility, developers can deploy onsite generation systems as prime power, allowing portions of a campus to begin operating while utilities complete longer-term grid upgrades. Incorporating flexible onsite capacity can also help streamline the interconnection process by enabling utilities to take a more phased approach to bringing large new loads onto the system, making projects easier to accommodate. Once grid service is available, those systems transition into a reliability and grid support role.

By enabling earlier deployment while maintaining long-term integration with the grid, this approach addresses both speed and cost. When data centers bring flexible power resources with them, they assume more of the risk associated with integrating large new loads, reduce pressure on the system through operational flexibility, and help ensure new infrastructure does not shift costs onto existing ratepayers. In doing so, they contribute additional capacity and private investment that can help reduce the risk of over-building system-wide infrastructure. This model aligns closely with emerging policy priorities aimed at ensuring large new loads can be integrated without increasing costs for surrounding communities.

A Grid-Integrated Model in Practice

The grid-integrated approach is already being deployed in practice.

In Northern California, ERock is currently constructing a data center power system supporting a new Microsoft facility that pairs grid electricity with flexible onsite generation. The regional grid remains the primary source of power for the campus, and because California's electricity system is increasingly supplied by low- and zero-emission resources, the facility benefits from the state's growing portfolio of renewable energy while supporting cost outcomes consistent with broader ratepayer protection goals.

Alongside grid connection, ERock is deploying highly efficient onsite natural gas generation offset by renewable natural gas supply. The system is also California Air Resources Board Distributed Generation (CARB DG) compliant, meeting the nation’s strictest emission standards for reciprocating engines and aligning with California’s most rigorous air quality requirements.

The onsite dispatchable generation provided by ERock's ultra-low emission generators not only protects the data center but also helps meet resource adequacy needs in the state by allowing the load to come off the grid when called upon by CAISO. In this way, the project provides incremental resource adequacy capacity using assets already deployed for data center operations, and early comparisons suggest that these co-located assets may deliver that capacity at a lower cost than new standalone generation resources, reflecting the efficiency of leveraging existing infrastructure to serve the underlying load.

The result is a model that aligns the needs of developers, utilities, and communities: accelerating the deployment of AI infrastructure while supporting grid reliability, protecting ratepayers, and strengthening the power system that the AI economy depends on.

Power Strategy Will Reshape the AI Race

In the global race to lead the AI economy, compute matters — but the power infrastructure behind it is just as critical. The projects that succeed will be the ones that treat power strategy as core infrastructure from the outset: pairing grid electricity with flexible onsite generation to accelerate deployment, aligning with affordability-focused policy goals, and ensuring new demand is integrated in a way that strengthens the broader energy system.

The future of AI leadership will depend not only on who builds the most powerful models, but on who builds the energy systems capable of supporting them at scale.

This article was originally published on LinkedIn.

 
