Dive Brief:
- Limited grid power availability and rising industrial electricity rates pose serious challenges for large-scale data center development in key markets, JLL said in a Feb. 26 report.
- More data center developers and end users — along with managers of other energy-intensive facilities, such as manufacturing plants and EV charging hubs — are turning to flexible onsite power solutions, according to the report. Josephine Tucker, JLL’s head of energy advisory and sustainability in the Americas, said building operators in other sectors like manufacturing and healthcare are paying premiums of nearly 50% for sites with adequate power.
- Rising demand for power and increasingly intense computing workloads will push data centers to adopt modular cooling solutions that can be deployed quickly while using power and water more efficiently, Nautilus Data Technologies CEO Rob Pfleging said in an interview with Facilities Dive.
Dive Insight:
The latest research from JLL, a global commercial real estate investment and advisory firm, builds on a January report that forecast a 7% average annual increase in data center rents through 2030. One of the factors driving the projected rise is limited power availability, which can delay or derail development in otherwise suitable locations.
According to JLL’s Feb. 26 report, the wait for new large-scale data centers to connect to the power grid is approaching five years in major data center markets. Meanwhile, industrial power prices in major world economies rose 18% from 2019 to 2024, nearly five times the pace of the previous five-year period.
“Energy is no longer a background operating cost. Power availability, reliability and costs are increasingly shaping site selection, development feasibility and asset performance,” Paulina Torres, JLL’s global research director for sustainability, said in a statement.
Big tech companies like Amazon and Google, and specialized data center developer-operators like Switch and Equinix, are moving toward a “bring your own power” model that involves deploying onsite generators and batteries in self-contained microgrids until public grid connections become available. But this approach raises the already high costs of building new data centers. On Friday, Bloomberg News reported that Oracle and OpenAI said they would cease work on their portion of the high-profile Stargate computing cluster near Abilene, Texas, underscoring the financial and logistical challenges of hyperscale data center development.
The present environment favors data centers that adopt efficient, modular cooling technologies, Pfleging said. His company pivoted from developing and operating its own data centers — it built what is still one of the world’s only floating data centers — to selling cooling solutions to its former competitors before the AI boom took off.
“People were laughing at us four years ago” for engineering direct-to-chip cooling solutions for high-density server racks, Pfleging said. Now, liquid cooling is the norm for AI and high-performance computing facilities.
“It’s great that the industry is kind of aligning with us now — we feel vindicated to some extent,” Pfleging said.
Today, Nautilus markets modular cooling designs that it says can reduce the share of power data centers use for cooling and other non-computing activities to 15% or less, a 50% improvement over historical energy usage rates. Its EcoCore systems are around 70% factory-built and 30% site-fitted, making them easier to deploy than traditional “modular” systems that are about half factory-built and half site-built. They can cool racks with over 100 kilowatts of computing load, in line with NVIDIA’s latest AI chip generations.
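The “50% improvement” follows from simple arithmetic: if cooling and other non-computing loads historically consumed roughly 30% of a facility’s power, cutting that share to 15% halves the overhead. A minimal sketch of that math (the 30% historical baseline and the power-usage-effectiveness framing are illustrative assumptions, not figures from Nautilus or JLL):

```python
def overhead_metrics(overhead_share: float) -> dict:
    """Given the fraction of total facility power used for cooling and
    other non-computing loads, return the IT share and the implied
    PUE (total facility power divided by IT power)."""
    it_share = 1.0 - overhead_share
    return {"it_share": it_share, "pue": 1.0 / it_share}

# Assumed historical baseline: ~30% of power going to non-computing loads.
historical = overhead_metrics(0.30)  # PUE of roughly 1.43

# Nautilus's claimed target: 15% or less.
target = overhead_metrics(0.15)  # PUE of roughly 1.18

# Halving the overhead share is the "50% improvement" in the claim.
improvement = (0.30 - 0.15) / 0.30
```

Under these assumptions, the claim is about the overhead share of total power, not total facility consumption: the computing load itself is unchanged, but the energy spent around it drops by half.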
Some analysts expect these increasingly powerful chips to live primarily on massive data center campuses with combined computing loads exceeding 250 megawatts. A Bain report published in October predicted 50 data centers of this scale would come online between 2025 and 2030, bringing the global total to 60.
In a blog post last month, Nautilus predicted that while large computing clusters will play an important role in the future, more flexible data centers will power much of the next phase of AI deployment. Customer demand for “inference” — actual use of AI tools, rather than model training — may drive this shift, which Pfleging said would result in a greater number of relatively small data centers being deployed closer to users.
“I think it has to happen [because] our appetite for technology consumption is voracious,” he said. Then again, “we were talking about ‘edge’ 10 years ago and I really thought it was going to happen,” he added.