Training An AI Model Now Costs No More Energy Than Raising And Feeding A Human For The Same Task: OpenAI CEO Sam Altman

There is concern in some quarters about the amount of energy AI models take to train, but OpenAI CEO Sam Altman has a thought-provoking analogy to justify these costs.

Altman, who leads one of the most influential and closely watched companies in the world, recently offered a reframing of one of the most persistent criticisms levelled at the AI industry. His argument is not a dismissal of the energy debate, but rather a challenge to the terms on which that debate is being conducted — and it raises some genuinely difficult questions about how we measure the true cost of intelligence, artificial or otherwise.

“One of the things that is always unfair in this comparison,” Altman said during an interview in New Delhi, “is people talk about how much energy it takes to train an AI model relative to how much it costs a human to do one inference query. But it also takes a lot of energy to train a human. It takes 20 years of life and all of the food you eat during that time before you get smart.”

He did not stop there. Altman extended the analogy far beyond a single human lifetime, arguing that the true cost of human intelligence stretches back across generations. “Not only that, it took the very widespread evolution of the hundred billion people that have ever lived to learn not to get eaten by predators, and to figure out science and whatever, to produce you.”

The crux of his argument arrives in the form of a direct challenge to critics. “The fair comparison,” Altman said, “is: if you ask ChatGPT a question, how much energy does it take — once its model is trained — to answer that question versus a human? And probably AI has already caught up on an energy efficiency basis, measured that way.”

The implications of Altman's framing are worth taking seriously. By shifting the unit of comparison from training costs to inference costs, that is, the energy required to answer a single question, he is making the case that the headline figures around AI's energy consumption are being misapplied. And on the numbers, he has a point.

A human brain draws roughly 20 watts of power continuously, which works out to around 3,500 kilowatt-hours over 20 years of development, before a single useful answer is produced. Factor in the energy needed to grow, transport, and prepare the food that fuels the rest of the body over that period, and the embodied energy of a human "getting smart" plausibly runs into the hundreds of thousands of kilowatt-hours.

Training a large AI model is costlier still: outside estimates put GPT-4's training run on the order of 50 gigawatt-hours, or roughly 50 million kilowatt-hours. That is an enormous number in isolation, but it is a one-time cost. Amortized across the billions of queries a deployed model goes on to answer, the training share works out to a fraction of a watt-hour per query. Inference itself is similarly cheap: estimates for a single ChatGPT query range from roughly 0.3 to 3 watt-hours, compared with the roughly 20 watt-hours a human brain burns through in an hour of focused thought.

Altman's broader point, that the evolutionary cost of producing human intelligence across a hundred billion people dwarfs anything the AI industry has consumed, is harder to quantify but not unreasonable to raise. The comparison, far from being a deflection, may actually be the most honest way to measure what intelligence, in any form, truly costs.
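For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch in Python. The constants are the rough estimates discussed above (a 20-watt brain, a roughly 50 gigawatt-hour training run, 0.3 to 3 watt-hours per query); the amortization figure of a billion queries a day for a year is an illustrative assumption, not a reported number.

```python
# Back-of-envelope comparison of human vs. AI energy costs.
# All constants are the rough estimates quoted in the article, not measurements.

HOURS_PER_YEAR = 365.25 * 24

# --- Human side ---
BRAIN_POWER_KW = 0.020   # ~20 W of continuous brain power draw
TRAINING_YEARS = 20      # Altman's "20 years of life" before you get smart

# Brain power alone over 20 years of "training" (~3,500 kWh)
brain_training_kwh = BRAIN_POWER_KW * TRAINING_YEARS * HOURS_PER_YEAR

# One hour of focused human thought (~20 Wh)
human_thought_kwh = BRAIN_POWER_KW * 1.0

# --- AI side (assumed figures, see lead-in) ---
GPT4_TRAINING_KWH = 50_000_000          # ~50 GWh, a commonly cited outside estimate
QUERIES_AMORTIZED = 1e9 * 365           # assumption: a billion queries a day for a year
QUERY_KWH_LOW, QUERY_KWH_HIGH = 0.0003, 0.003   # ~0.3 to 3 Wh per query

# Training cost spread across all the queries the model then answers
training_share_kwh = GPT4_TRAINING_KWH / QUERIES_AMORTIZED

print(f"Human 'training' (brain only):  {brain_training_kwh:,.0f} kWh")
print(f"Human hour of focused thought:  {human_thought_kwh * 1000:.0f} Wh")
print(f"AI training share per query:    {training_share_kwh * 1000:.2f} Wh")
print(f"AI inference per query:         {QUERY_KWH_LOW * 1000:.1f}-{QUERY_KWH_HIGH * 1000:.1f} Wh")
```

Under these assumptions the training share comes out to about 0.14 watt-hours per query, and even the high-end inference estimate of 3 watt-hours sits well below the 20 watt-hours an hour of human thought consumes, which is the sense in which Altman can claim AI "has already caught up" on a per-answer basis.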
