Using AI to unlock the grid
Speakers from across the utility and IT sectors gathered at Harvard Climate Action Week with a clear message: The U.S. power grid must rapidly transform to keep up with growing energy demands. While AI drives much of the recent growth — electricity needs from data centers worldwide are expected to double by 2030 — AI can also offer vital tools for upgrading grid infrastructure, optimizing energy distribution, and leading the U.S. toward decarbonization.
Just a few years ago, the grid was “a fairly slow-moving, static system,” said Gordon van Welie, president and CEO of ISO New England, a nonprofit corporation that plans and operates power grids across six states. In New England, it included a few hundred large generators connected to a transmission system that supplied energy to customers. Now, however, the grid incorporates a wider variety of renewable energy sources and rapidly increasing demand, vastly changing operational dynamics.
“The exchange between the transmission and the distribution system is creating greater complexity and much more uncertainty,” van Welie said. “Once you expand your system out from a few hundred things to millions of things, there’s a massive explosion of data. That’s why I think AI is going to be fundamental for us managing the complexity of these risks.”

A tangle of considerations
Minlan Yu, the Gordon McKay Professor of Computer Science and co-leader of The Power and AI Initiative at the Harvard School of Engineering and Applied Sciences, acknowledged that data centers are scaling up at lightning speed. Just a few years ago, centers required mere megawatts of energy, but new facilities demand orders of magnitude more — some reaching as high as 10 gigawatts of electricity each — dramatically reshaping infrastructure and workforce demands.
Yu added that data centers can operate in ways that ease stress on the grid — for example, by strategically distributing workloads across time and geography so that centers aren’t drawing heavy power at the same moments households are.
But questions abound about how to power AI growth sustainably. Ram Rajagopal, associate professor of civil and environmental engineering and of electrical engineering at Stanford University, said that his team had identified four areas of concern.
One is reliability. For instance, the large, synchronized workloads that data centers run to train AI models can cause sharp power drops — sometimes several hundred megawatts in just a few seconds — creating system instabilities. Figuring out how to smooth out these dips will be crucial for building sustainable AI. Engineers, researchers, regulators, and policymakers should also address the other three areas: flexibly integrating data centers into the grid, managing cost impacts on consumers, and limiting the water these centers use and the emissions they create.
“When I talk to some close colleagues working at places like Google, Meta, etc., they say, ‘You know, [emissions] were the number one priority three years ago. Now it’s like the third or fourth priority,’” Rajagopal said.

Judy Chang, a member of the Federal Energy Regulatory Commission, said that AI could help fast-track some aspects of grid transformation — for example, expediting lengthy studies that are required to connect new generators — “but it doesn’t solve the other problem, which is the timeline to build infrastructure on the grid is very different from the timeline for building data centers.”
Data centers often take one to three years to build, but updating grid infrastructure can take anywhere from five to 30 years. “We need to do this much more quickly, not just the studies and the upgrades and the permitting process, but also the regulatory decisions need to be made quickly, and we’re not used to doing that,” she said.

Communications and trust
The panelists agreed that tough business, operations, and engineering challenges lie ahead, and as AI seeps into new areas of modern living, the stakes rise and problems become more complex.
Yu of the Power and AI Initiative said that as AI is increasingly integrated into critical healthcare, military, and government operations, data centers will likely need to draw power even during peak-demand periods, creating new variables when designing flexible strategies for connecting to the grid.
Panelists also agreed that powering AI will require much more communication among the utility, data-center, and regulatory sectors.
“It’s not just about data,” Rajagopal said. “You have to build trust.”