AI CEOs aren’t just competing with each other to poach the best talent; they’re also competing over how many GPUs they can put in that talent’s hands.
After Sam Altman said that OpenAI would have 1 million GPUs online by the end of the year, Elon Musk said that xAI plans to have 50 million H100-equivalent GPUs within 5 years. Altman made the 1 million GPU announcement before announcing a datacenter deal with Oracle.

“We have signed a deal for an additional 4.5 gigawatts of capacity with Oracle as part of Stargate,” Altman posted on X. “Easy to throw around numbers, but this is a gigantic infrastructure project,” he added. Altman also said that OpenAI was planning to expand the scope of the Stargate project beyond the $500 billion that was first announced.
A day earlier, Altman had said that OpenAI would have over 1 million GPUs online by the end of 2025. “We will cross well over 1 million GPUs brought online by the end of this year! Very proud of the team but now they better get to work figuring out how to 100x that lol,” he posted.
Elon Musk, however, didn’t seem to want to be outdone. “The xAI goal is 50 million in units of H100 equivalent-AI compute (but much better power-efficiency) online within 5 years,” he said on X.
These are staggering numbers. For context, GPT-4, which was released just two years ago and was state-of-the-art at the time, was trained on only 25,000 GPUs. OpenAI will have 40x that number of GPUs by the end of the year, and xAI plans to have 2,000x that number in five years.
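For readers who want to sanity-check those multiples, here is a minimal back-of-the-envelope sketch in Python. It assumes the widely reported 25,000-GPU figure for GPT-4’s training run, and note that xAI’s target is stated in H100-equivalents, so this compares headline counts rather than like-for-like hardware:

```python
# Back-of-the-envelope check of the multiples quoted above.
# The GPT-4 figure is a reported estimate, not an official number.
gpt4_training_gpus = 25_000       # reported GPUs used to train GPT-4
openai_end_of_2025 = 1_000_000    # Altman: "well over 1 million" online this year
xai_five_year_goal = 50_000_000   # Musk: 50M H100-equivalents within 5 years

print(f"OpenAI end-2025 vs GPT-4 run: {openai_end_of_2025 / gpt4_training_gpus:.0f}x")  # 40x
print(f"xAI 5-year goal vs GPT-4 run: {xai_five_year_goal / gpt4_training_gpus:.0f}x")  # 2000x
```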
The new announcements came on the same day as a WSJ report which said that OpenAI’s $500 billion Stargate project was going slower than planned, and would have built only a small datacenter by the end of the year. Earlier this year, Elon Musk had cast doubt on the $500 billion number, saying that SoftBank didn’t actually have the money to follow through on the commitment.
The WSJ report appears to have spurred OpenAI into announcing its Oracle deal, which in turn seems to have prompted Elon Musk to announce xAI’s 50 million GPU target. These are heady times in AI, and top CEOs are doing their best to one-up each other on the number of GPUs they can provide to their teams. NVIDIA, meanwhile, is likely sitting back and smiling at it all.