The Big Takeaway From Google’s AI Climate Report
By Justin Worland
For the climate-concerned, the rise of the AI-reliant internet query is a cause for alarm. Many people have turned to ChatGPT and other services for simple questions, and even basic Google searches now include an AI-generated result.
Depending on how you crunch the numbers, it's possible to come up with a wide range for the energy use and related climate cost of an AI query. OpenAI, the maker of ChatGPT, has estimated up to 0.34 watt-hours per prompt, equivalent to running a household lightbulb for about 20 seconds, while one set of researchers concluded that some models may use as much as 100 times more for longer prompts.
On Thursday, Google released its own data: the average search using Gemini, the company's ubiquitous AI tool, uses 0.24 watt-hours, equivalent to watching about nine seconds of TV, and emits 0.03 grams of carbon dioxide equivalent. Perhaps more interesting is how much cleaner Google says Gemini text queries have become over time. Over the last year, energy consumption per query has decreased by around 97% and carbon emissions per query by 98%, the company said. A separate Google report released earlier in the summer showed a decoupling of data center energy consumption from the resulting emissions. (It's worth noting, of course, that simple text queries are less intensive than functions like image, audio, or video generation, and that these figures exclude model training, which Google left out of the report given the challenges of calculating it accurately.)
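As a rough sanity check on those figures, here is a back-of-the-envelope sketch; the 100-watt TV is an assumption for illustration, not a number from Google's report:

# Sanity check of Google's per-query figures.
query_wh = 0.24          # watt-hours per Gemini text query (Google's figure)
tv_watts = 100           # assumed power draw of a TV; not from the report

# 0.24 Wh at 100 W runs for about 8.6 seconds, close to the "nine seconds of TV" comparison.
tv_seconds = query_wh / tv_watts * 3600
print(f"Seconds of TV per query: {tv_seconds:.1f}")

# A 97% drop in energy per query over a year works out to roughly a 33x improvement.
print(f"Implied efficiency factor: {1 / (1 - 0.97):.0f}x")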
Read more: Some AI Prompts Can Cause 50 Times More CO2 Emissions Than Others
Whether such a downward trajectory can continue is a crucial question for anyone watching the future of energy and climate in the U.S., with implications not just for the country's emissions but also for the hundreds of billions of dollars in power-sector investments. Across a variety of related industries, leaders will need to thread the needle: meeting the growing demand for AI while avoiding overbuilding infrastructure as AI models grow more efficient.
Google’s progress boils down to two levers: cleaner power and more efficient chips and query crunching.
The clean energy strategy is impressive but fairly straightforward. The company buys a lot of renewable energy to power its operations, signing contracts for 8 GW of clean power last year alone, equivalent to the capacity of 2,400 utility-scale wind turbines, according to Department of Energy figures. The company has also invested in helping bring future clean technologies like nuclear fusion online.
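Reading that comparison backwards gives a sense of the turbine size it assumes; a minimal sketch using only the figures quoted above:

# Implied average turbine size behind the "2,400 utility-scale wind turbines" comparison.
contracted_gw = 8.0      # clean power Google contracted last year
turbine_count = 2400

mw_per_turbine = contracted_gw * 1000 / turbine_count
print(f"Implied capacity per turbine: {mw_per_turbine:.1f} MW")   # roughly 3.3 MW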
But then there are the company's efficiency measures. In energy circles, efficiency tends to refer to simply using less energy and making hardware run more productively (think better climate control or insulation). While Google has done some of that, the most impressive efficiency gains have come through the AI ecosystem rather than the energy system. The company has designed its own chips, which it calls TPUs, as an alternative to the widely used GPUs; those chips have become some 30 times more efficient since 2018, according to Google's sustainability report. The company has also improved the efficiency of its models with techniques that crunch queries differently, reducing the compute power needed. And a few weeks ago the company announced a program to shift data center demand to times when the electricity grid is less stressed.
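To put the chip numbers in perspective, here is what a 30-fold gain since 2018 implies as an annual rate, assuming the gain accrued over about seven years (the report's exact window is not spelled out here):

# Annualized rate implied by a 30x efficiency gain over an assumed 7-year span.
total_gain = 30
years = 7               # assumption: 2018 through 2025

annual_improvement = total_gain ** (1 / years) - 1
print(f"Implied annual efficiency improvement: {annual_improvement:.0%}")   # about 63% per year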
Read more: AI Could Reshape Everything We Know About Climate Change
The question, not just for Google but for any company deeply invested in AI, is whether those programs and the resulting efficiency gains can continue. Further efficiency gains would be a huge climate win, so long as the increase in usage doesn't outpace the increase in efficiency.
Greater efficiency would also have significant implications across the energy sector. Right now, power companies are betting big on new sources of electricity generation on the assumption that AI will keep driving demand growth. But it's very hard to predict exactly how fast demand will grow, and prospective efficiency gains are a big reason why. Google's results should at least give forecasters pause: efficiency is the known unknown in those demand projections.