We did the math on AI’s energy footprint. Here’s the story you haven’t heard.

AI is the hottest technology of our time. Still, so much about it, including its energy use and the resulting potential climate impact, remains unknown. Leading AI companies keep exact figures about the technology’s energy consumption closely guarded. But we did the math to figure it out.


It’s well documented that AI is a power-hungry technology. But there has been far less reporting on the extent of that hunger, how much its appetite is set to grow in the coming years, where that power will come from, and who will pay for it. For the past six months, MIT Technology Review’s team of reporters and editors has worked to answer those questions. The result is an unprecedented look at the state of AI’s energy and resource usage: where it is now, where it is headed in the years to come, and why we have to get it right.

The centerpiece of this package is an entirely novel line of reporting into the demands of inference: the way we interact with AI when we type in queries or ask it to generate new images or videos. Experts say inference is set to eclipse the already massive amount of energy required to train new AI models. We were so startled by what we learned reporting this story that we also put together a brief on everything you need to know about estimating AI’s energy and emissions burden.

And then we went out into the world to see the effects of this hunger. We take you into the deserts of Nevada, where data centers in an industrial park the size of Detroit demand ever more water to keep their processors cool and running. In Louisiana, where Meta plans its largest-ever data center, we expose the dirty secret that will fuel its AI ambitions, along with those of many others. Separately, we examine why the clean-energy promise of powering AI data centers with nuclear power will long remain elusive.

Finally, we look at the reasons for optimism and examine why future AI systems could be far less energy-intensive than today’s.

You can explore every element of this first-of-its-kind package here. We hope it helps you understand the growing complexity of our shared future.

To keep up with more of our climate reporting, sign up for our free weekly newsletter, The Spark.

Image: Nick Little

Njualem Amos (ATM)


This is great reporting. With agentic AI stepping into play, I wonder what the world will turn into as science fiction slowly becomes the new reality.

Rajarshi Saha


Important conversation. AI’s potential is massive, but so is its energy footprint. Transparency must scale with innovation.


I've designed a Universal Node Linguistics System that can drastically reduce the amount of AI training required. I share this in the hope that it helps cut training-related energy use. Hoping developers will see this, and hoping you can help spread the word. Materials can be found here, and the paper is available under my profile: Kunferman, C. R. (2025). Universal Word Parsing Node Linguistic System (Complete) (Version 3). Journal of Experiential Research. https://doi.org/10.5281/zenodo.15399509

