The AI Race Is Pressuring Utilities to Squeeze More From Europe’s Power Grids


European countries are racing to bring new data centers online as AI labs across the globe continue to demand more compute. The primary limiting factor is energy—and specifically, the ability to move it.

Though Europe is on track to generate enough energy, utilities experts say, grid operators broadly lack the infrastructure needed to transport it to where it needs to go. That’s throttling grid capacity and, by extension, the number of new power-hungry data centers that can connect without risking blackouts.

National Grid, which operates the transmission network in England and Wales, says that proposed data centers representing more than 30 gigawatts (GW) of power demand are awaiting connection to its grid, equivalent to roughly two-thirds of Great Britain's peak demand. Even accounting for the likelihood that some of those data centers will never be built, there is currently not enough room to accommodate them.

The wait for permission to plug in is causing some data center projects to collapse, undermining European ambitions to capture a share of the hundreds of billions of dollars AI labs are spending on compute. “Across Europe, projects are being cancelled because there’s no access to the grid,” claims Taco Engelaar, managing director at grid optimization company Neara.

Under pressure from government to clear the blockage, grid operators are experimenting with ways of eking additional capacity out of their existing networks—from switching the metals used in power lines, to bypassing areas of congestion, to dialing the amount of energy moving across lines up and down based on changes in weather conditions.

“There’s no one simple solution,” says Steve Smith, president of National Grid Partners, the venture capital division of National Grid. “What you have to do is a lot of everything.”

The queue of data centers waiting to join the UK grid began to swell rapidly toward the end of 2024, around the time the government designated them “critical national infrastructure.” Since then, connection applications have “far exceeded even the most ambitious forecasts,” according to UK energy regulator Ofgem, and the queue has tripled in size. “We knew we had this new wave of demand coming from electrification of transport and heat,” says Smith. “Now we’ve got AI on top.”

One obvious solution is to build new power lines, but that’s both expensive and slow. Depending on the scale of a development, it can take anywhere from seven to fourteen years to build new transmission infrastructure, accounting for potential planning issues, legal objections, supply chain and labor bottlenecks, and construction. “It takes time to put the stuff in the ground, connect it up, get the linesmen up there to do all that work,” says Jack Presley Abbott, deputy director for strategic planning and connections at Ofgem.

The particular geography of the UK poses further problems. A large proportion of the UK’s renewable energy is generated in Scotland and the North of England, whereas energy consumption—including by data centers—is concentrated at the opposite, more populous end of the country. Meanwhile, difficult terrain on the UK’s western flank means transmission lines have to be routed down the east of the country’s landmass or offshore, limiting the options for network expansion.

Against that backdrop, National Grid is experimenting with technologies that can be applied after-the-fact to squeeze more capacity out of the grid and potentially allow more data centers to connect. “Large customers willing to pay to use your network are fantastic. The trick is, can you find ways of connecting them where you don’t have to build huge amounts of new infrastructure?” says Smith.

One of the most promising options, according to National Grid, is a sensor-based system called dynamic line rating (DLR), whereby the amount of energy directed down a power line is tuned to local weather conditions. When more energy—and therefore heat—passes through a line, it sags lower to the ground, potentially creating a hazard. But on a cold and windy day, the environmental cooling effect allows for more energy to safely pass through, increasing the line’s capacity.

“A lot of operators make very cautious assumptions about the flow that could go through these lines. We think that around three quarters of the UK network is capable of transporting more energy,” claims Engelaar. “A relatively small increase in the amount of heat running through a line translates into a large increase in energy throughput—it’s non-linear.”
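The weather dependence behind DLR can be illustrated with a toy steady-state heat-balance model: the current a conductor can safely carry is the point where Joule heating matches the cooling the weather provides. The sketch below is a deliberate simplification with placeholder coefficients; real DLR systems use the full IEEE 738 / CIGRE heat-balance models with measured solar gain, wind angle, and conductor emissivity.

```python
import math

def ampacity_amps(wind_speed_ms, ambient_c, conductor_max_c=75.0,
                  resistance_ohm_per_m=7e-5, diameter_m=0.028):
    """Toy ampacity estimate for a bare overhead conductor.

    Illustrative only: the convection coefficient and conductor
    parameters are rough placeholders, not engineering values.
    """
    # Allowable temperature rise before the line sags too low.
    delta_t = conductor_max_c - ambient_c
    # Simplified forced-convection coefficient: cooling grows with wind.
    h = 4.0 + 4.0 * math.sqrt(max(wind_speed_ms, 0.1))  # W/(m^2*K)
    surface_per_m = math.pi * diameter_m  # conductor surface area per metre
    cooling_w_per_m = h * surface_per_m * delta_t
    # Joule heating I^2 * R must not exceed the available cooling,
    # so ampacity scales with the square root of the cooling budget.
    return math.sqrt(cooling_w_per_m / resistance_ohm_per_m)

# A cold, windy day supports substantially more current than a hot,
# still one on the same physical line.
cold_windy = ampacity_amps(wind_speed_ms=8.0, ambient_c=5.0)
hot_still = ampacity_amps(wind_speed_ms=0.5, ambient_c=30.0)
```

Because ampacity goes as the square root of the cooling budget while heating goes as the square of the current, modest changes in assumed conditions translate into disproportionate changes in safe throughput, which is the non-linearity Engelaar describes.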

An EU study found that, by applying “grid-enhancing technologies” like DLR, operators could increase overall network capacity by as much as 40 percent, in theory clearing room for data centers and other large sources of power demand to connect.

However, while it plans to deploy DLR across many of its busiest circuits within the next two years, National Grid has so far applied the technology to only 275 km of lines. “We’d love to be able to move fast and break things, but when we do that, the lights go out,” says Smith.

Equally, at times when data centers require the largest amount of energy to cool hardware—during heatwaves, say—it may not necessarily be safe to run more energy across the grid. “It’s kind of the opposite of what you want,” says Keith Bell, a professor of electrical engineering at the University of Strathclyde and co-director of the UK Energy Research Center. “Their demand is going to be higher on a hot day, but your network capacity is lower.”

National Grid aims to temper that effect by pairing DLR with technologies that divert energy around congested circuits, and with others that let data centers flex their consumption with peaks and troughs in countrywide demand, reducing energy usage or switching to on-site batteries when the grid is under strain. Because traditional data centers must supply uninterrupted compute, grid operators have historically treated them as inflexible sources of power demand. But AI data centers run workloads that, though extremely energy-intensive, are more intermittent, and trial data suggests they may be able to adjust without disrupting essential tasks. “The big unlock for AI data centers is flexibility,” claims Smith. “If a hyperscale data center can provide flexibility in the periods we need it…[it’ll] get connected faster.”
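The flexibility described above amounts to a dispatch rule: when the grid is strained, a data center either serves its load from on-site batteries or throttles deferrable work. The sketch below is a hypothetical policy, not National Grid's or any operator's actual scheme; the stress flag, state-of-charge threshold, and draw levels are all invented for illustration.

```python
def plan_power_draw(grid_stress, workload_urgent, battery_soc,
                    full_draw_mw=100.0, reduced_draw_mw=40.0):
    """Toy dispatch rule for a flexible AI data center (illustrative).

    Returns (megawatts drawn from the grid's perspective, power source).
    All thresholds and draw levels are assumptions, not real figures.
    """
    if not grid_stress:
        # Normal conditions: run flat out from the grid.
        return full_draw_mw, "grid"
    if battery_soc > 0.5:
        # Under strain, serve the load from on-site batteries if they
        # hold enough charge, taking pressure off the network.
        return full_draw_mw, "battery"
    if not workload_urgent:
        # Deferrable AI training jobs can be throttled until the
        # strain passes, cutting grid draw without losing work.
        return reduced_draw_mw, "grid"
    # Urgent workloads with a depleted battery still need full power.
    return full_draw_mw, "grid"
```

Under this toy rule, only urgent workloads with depleted batteries draw full power during a grid event, which is the kind of guarantee that could let an operator connect a flexible data center sooner than an inflexible one.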

Identifying how soon grid-enhancing technologies will yield material capacity increases and how much extra juice they will squeeze out of the network is an imprecise science. National Grid estimates that, in the last five years, it has expanded network capacity by 16GW through a combination of grid-enhancing technologies and replacing old lines with new, more conductive ones.

The idea that this collection of technologies could act as a stopgap, allowing more sources of demand to connect while additional transmission infrastructure is developed, is also undermined by the fact that present rules disqualify National Grid from factoring data center flexibility into grid connection planning decisions.

Much of the network capacity growth required to accommodate AI data centers will ultimately need to come from newly built infrastructure. “Our plan over the next five years to double the amount of energy that flows over the network requires intervention—it requires build of overhead lines,” says David Adkins, head of network architecture and innovation at National Grid. “We are going to need to build more physical infrastructure.”

Meanwhile, to try to cut the bloated grid connection queue, Ofgem is preparing reforms designed to help sift the most viable and serious proposals from more opportunistic ones put forward by developers speculating on AI. The regulator has also threatened financial penalties for grid operators that fail to increase their network capacity and meet connection deadlines.

“Getting connected sooner. That’s the name of the game, right?” says Presley Abbott. “We need to connect these data centers as soon as possible to get that advantage.”