There has been plenty of concern about how much energy AI systems consume, but the actual numbers don't seem alarming at all.
Google has said that its AI assistant Gemini consumes 0.24 Wh of energy per text-based query — about as much as watching TV for nine seconds. Google also said that it has reduced the energy footprint of the median Gemini text prompt by 33x over the last year.
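The TV comparison is easy to sanity-check with back-of-the-envelope arithmetic. Google does not publish the TV wattage behind its "average TV" figure, so the 100 W draw below is an assumption:

```python
WH_PER_QUERY = 0.24   # Google's reported median energy per Gemini text prompt
TV_POWER_W = 100      # assumed "average TV" power draw (not disclosed by Google)

# Convert energy (Wh) into seconds of TV viewing at the assumed wattage
seconds_of_tv = WH_PER_QUERY / TV_POWER_W * 3600
print(f"{seconds_of_tv:.1f} s of TV")  # ~8.6 s, close to Google's "~nine seconds"
```

At 100 W the figure comes out to about 8.6 seconds, consistent with the "~nine seconds" Google cites; a slightly less power-hungry set would land exactly on nine.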

“AI efficiency is important,” Google Chief Scientist Jeff Dean said on X. “We estimate that the median Gemini Apps text prompt uses 0.24 watt-hours of energy (equivalent to watching an average TV for ~nine seconds), and consumes 0.26 milliliters of water (about five drops) — figures that are substantially lower than many public estimates,” he added.
“At the same time, our AI systems are becoming more efficient through research innovations and software and hardware efficiency improvements. From May 2024 to May 2025, the energy footprint of the median Gemini Apps text prompt dropped by 33x, and the total carbon footprint dropped by 44x, through a combination of model efficiency improvements, machine utilization improvements and additional clean energy procurement, all while delivering higher quality responses,” Dean said.
In a detailed blog post, Google said that approaches like distillation, Mixture-of-Experts models and Accurate Quantized Training (AQT) were helping it reduce its energy footprint. “We continuously refine the algorithms that power our models with methods like Accurate Quantized Training (AQT) to maximize efficiency and reduce energy consumption for serving, without compromising response quality,” it said.
Google’s energy consumption numbers appear somewhat lower than OpenAI’s, though the two companies may not have used the same methodology to calculate them. While Google says the median text query consumes 0.24 Wh of electricity, OpenAI said in June that its average query uses about 0.34 Wh. “(This is) about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon,” OpenAI CEO Sam Altman had written in a blog post.
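Putting both companies' reported figures into the same units makes the comparison concrete. The drop-volume approximation below is a common rule of thumb, not something either company specifies:

```python
# Reported per-query energy figures (Wh): Google's is a median, OpenAI's an average
GOOGLE_WH = 0.24
OPENAI_WH = 0.34

# Reported per-query water figures: Google 0.26 mL, OpenAI 0.000085 US gallons
ML_PER_US_GALLON = 3785.41
ML_PER_TEASPOON = 4.93   # US teaspoon
ML_PER_DROP = 0.05       # rough rule-of-thumb drop volume (assumption)

openai_ml = 0.000085 * ML_PER_US_GALLON
print(f"Energy ratio (OpenAI/Google): {OPENAI_WH / GOOGLE_WH:.2f}x")  # ~1.42x
print(f"OpenAI water per query: {openai_ml:.2f} mL "
      f"(~{openai_ml / ML_PER_TEASPOON:.3f} tsp)")  # ~0.32 mL, about 1/15 tsp
print(f"Google water per query: ~{0.26 / ML_PER_DROP:.0f} drops")  # ~5 drops
```

OpenAI's stated energy figure works out to roughly 1.4x Google's, and both companies' water figures are consistent with their own unit conversions (0.000085 gallons is indeed about one fifteenth of a teaspoon).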
These numbers are far lower than what environmentalists have claimed about the energy footprint of AI systems. And they are likely to keep falling: companies pay for the electricity that powers their datacenters, and are incentivized to optimize their systems to reduce energy consumption. With Google already reporting a 33x reduction in a single year, AI's energy consumption could fall even further in the years ahead.
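The cost incentive can be made concrete with a rough per-query calculation. The electricity rate below is an assumption; Google does not disclose what it pays:

```python
WH_PER_QUERY = 0.24
USD_PER_KWH = 0.08   # assumed wholesale/datacenter electricity rate; varies by region

# Electricity cost of serving a single median text query
cost_per_query = WH_PER_QUERY / 1000 * USD_PER_KWH
queries_per_dollar = 1 / cost_per_query
print(f"${cost_per_query:.8f} per query (~{queries_per_dollar:,.0f} queries per dollar)")
```

At these rates a single query costs a tiny fraction of a cent, but across billions of daily queries the electricity bill is substantial — which is exactly why providers have a direct financial reason to keep driving per-query energy down.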