There has been all manner of speculation about how much electricity ChatGPT consumes, but OpenAI has now come out with an official number, along with some context on what that figure means.
OpenAI CEO Sam Altman has said that an average ChatGPT query uses 0.34 watt-hours of electricity. “People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon,” Altman wrote in a blog post. Altman added that the cost of intelligence, delivered through AI models, would eventually converge to the cost of electricity.
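Altman's equivalences can be sanity-checked with some quick unit conversions. The appliance wattages below are illustrative assumptions (a roughly 1 kW oven element and a 10 W LED bulb), not figures from his post:

```python
# Sanity-check the stated equivalences for 0.34 Wh per query.
# OVEN_W and BULB_W are assumed typical wattages, not from Altman's post.
QUERY_WH = 0.34          # watt-hours per average ChatGPT query (Altman)
QUERY_GAL = 0.000085     # gallons of water per query (Altman)

OVEN_W = 1000            # assumed ~1 kW oven element
BULB_W = 10              # assumed ~10 W high-efficiency LED bulb
TSP_PER_GAL = 768        # US teaspoons in a gallon

oven_seconds = QUERY_WH * 3600 / OVEN_W      # energy (J) / power (W) = seconds
bulb_minutes = QUERY_WH * 60 / BULB_W        # Wh / W = hours, x60 = minutes
teaspoon_fraction = QUERY_GAL * TSP_PER_GAL  # gallons -> teaspoons

print(f"oven: {oven_seconds:.2f} s, bulb: {bulb_minutes:.2f} min, "
      f"water: 1/{1 / teaspoon_fraction:.0f} teaspoon")
```

Under these assumptions the numbers come out to about 1.2 seconds of oven time, 2 minutes of bulb time, and roughly a fifteenth of a teaspoon of water, which lines up with the comparisons in the post.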

There had been lots of concern that the average ChatGPT query was using 10x as much electricity as a typical Google search. But this might not necessarily be the case. Google had revealed all the way back in 2011 that a typical Google search took 0.3 watt-hours, which is roughly the same as what the average ChatGPT query takes now. It's possible that Google's systems have gotten more efficient over the years, or more energy-intensive, especially because of AI Overviews in search, but it does appear that the average ChatGPT query now takes only slightly more power than the average Google search did in 2011.
And ultimately, it could be more useful to look at ways to increase electricity production than to obsess over the amount of energy AI takes. AI seems to be here to stay, and it will likely be used in all aspects of work and play in the coming decades. And while companies will likely keep innovating to bring down electricity usage, which helps keep their own bills down for starters, governments and scientists need to come up with ways to make electricity abundant and cheap. Humanity is already on the AI path, and society needs to make sure that the resource powering it, electricity, doesn't end up being a constraint.