George Sakellaris is founder and CEO of Ameresco.
As global leaders gathered in Davos, the story that dominated the World Economic Forum was familiar: artificial intelligence is the next great engine of productivity, competitiveness, and economic growth. “Industries in the Intelligent Age” was one of the key themes this year.
But we are not paying enough attention to the less glamorous reality underneath it: the physical infrastructure required to make AI work. We can build the world’s best models, write the world’s best code and design the world’s fastest chips. But none of it matters if we cannot power the data centers. And today, the electrical grid cannot scale fast enough to support the pace of AI-driven demand without consequences for reliability and affordability.
U.S. data center power demand is forecast to rise 22% in 2025 and nearly triple by 2030, driven largely by AI workloads that are straining existing grid capacity. In fact, America’s largest power-grid operator, PJM Interconnection, is being pushed to the brink by surging demand from data centers, with generation capacity at risk of maxing out during extreme weather and consumer anger rising as rates increase. The question is what we do about it.
If we handle the AI boom the wrong way, we will put grid stability and public trust at odds with tech-led growth. If we handle it the right way, we can accelerate AI competitiveness without turning household electricity bills into collateral damage.
In the coming years, more large data center developers will reach the same conclusion: they cannot rely on the grid alone. They will have to bring their own power. AI workloads require enormous compute capacity. That means more servers, more cooling and far higher electricity consumption.
We are essentially building a new electricity-intensive industry in real time, yet our grid is still operating under assumptions built for a different era. Across the U.S., utilities and grid operators are warning that demand is rising faster than expected. Transmission and interconnection queues are growing. Even when generation is available, getting power to the right place at the right time has become a bottleneck.
For most Americans, resilience is something we notice only when it fails. But for a hyperscale data center, a power interruption is lost revenue, disrupted customers, damaged equipment and reputational risk. If you are investing billions of dollars in a facility that must operate around the clock, the answer is not to hope the grid catches up. The answer is to design for resilience.
This leads to an uncomfortable truth: The power system cannot meet every new load on the timeline the AI economy is demanding. This is not about blame. Utilities are working hard. Regulators are moving faster than before. But even with the best intentions, expanding and modernizing the grid takes time. Permitting, transmission planning, materials constraints and workforce availability do not disappear simply because demand is urgent.
In the meantime, placing massive new electrical loads onto an already strained system puts everyone at risk. If large customers treat the grid as unlimited and immediate, ratepayers will feel the consequences. That is why the next phase of American AI competitiveness must include a new operating model in which the grid is a partner. The data center of the future will not look like a traditional building with a utility line feeding it. It will look more like a self-contained energy ecosystem.
That model already exists in other parts of the economy. Hospitals, military bases, emergency shelters and critical government facilities do not operate under the assumption that outside power will always be available. They build redundancy. They invest in backup generation. They deploy microgrids. They plan for disruption.
Data centers should be treated the same way, because they increasingly perform critical functions for society and the economy. The difference now is scale. Some of these investments will be driven by economics. Others will be driven by reliability requirements and customer expectations. In Washington, policymakers speak often about winning the AI race.
That goal requires realism and, more importantly, energy. Energy is the foundation. If the U.S. fails to expand power infrastructure fast enough, we risk turning an AI boom into a self-inflicted constraint. We will delay projects. We will push investment to other regions. We will raise costs. And we will weaken resilience at the exact moment we need it most.
That means modernizing the grid. It means accelerating new generation and storage. It means reforming permitting and interconnection timelines. And it means acknowledging that large energy customers will need to invest directly in resilience.
Developers and operators are increasingly exploring behind-the-meter power solutions because they understand what is at stake. The future requires a partnership model where the grid provides connectivity and market coordination, while large customers bring distributed resiliency and flexibility.
The grid cannot do this alone. So the most reliable data centers of the future, the ones that protect uptime, control costs and support national competitiveness, will not just plug in. They will bring their own power.