AI is rapidly creating entire new industries, but the profits might not be distributed where most people expect.
Chamath Palihapitiya — venture capitalist, former Facebook executive, and one of Silicon Valley’s most outspoken voices — has a characteristically blunt take on the question. In a recent interview, he argued that the companies building large language models are unlikely to be the ones that ultimately cash in on them. Drawing on a striking historical analogy, he suggested the real fortunes will be made by businesses that use AI as a platform to build something entirely new — and that a critical, underappreciated ingredient will separate the winners from the rest.

“The person that invented refrigeration made some money,” Palihapitiya said, “but most of the money was made by Coca-Cola, who used refrigeration to build an empire.”
He was direct about where he sees today’s AI models fitting into that picture: “I view these large language models as refrigeration. Will there be some money made in them? I think so, but the Coca-Cola has yet to be built — and those are the companies that are really going to monetize it.”
What makes Palihapitiya’s argument particularly sharp is the technical point underneath it. He offered what he called “a basic thing about machine learning that’s worth knowing.”
“If you take 1,000 of the same inputs and give them to Facebook and Microsoft and Google and Amazon, they’ll all come up with the same machine learning model. But if you have one extra thing — one little ingredient that all of those other companies don’t have — your output can be markedly different. It’s like giving three great chefs three ingredients, but you give the third chef one extra one. That person has the ability to do something more.”
The implication is significant. In Palihapitiya’s framing, competitive advantage in the AI era won’t come from access to the models themselves — it will come from proprietary data, distribution, or domain knowledge that no one else possesses. A healthcare company with decades of patient records, a financial firm with unique transaction data, a logistics business with real-time supply chain signals — these are the kinds of “extra ingredients” that could produce something a general-purpose tech giant simply cannot replicate.
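The mechanism behind the "extra ingredient" point can be made concrete with a toy sketch. Everything below is hypothetical — the data, the "churn"/"stay" labels, and the simple centroid classifier are illustrative stand-ins, not anyone's actual pipeline — but it shows the two halves of the claim: a deterministic training procedure on identical inputs yields identical models, while one proprietary feature can change the prediction entirely.

```python
# Toy illustration (all data and the model are hypothetical): identical inputs
# plus deterministic training produce identical models; one extra proprietary
# feature can flip the output.
from collections import defaultdict

def train_centroid_classifier(rows, labels):
    """Deterministic 'training': store the mean feature vector per label."""
    sums, counts = {}, defaultdict(int)
    for row, y in zip(rows, labels):
        sums[y] = [a + b for a, b in zip(sums.get(y, [0.0] * len(row)), row)]
        counts[y] += 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(model, row):
    """Assign the label whose centroid is nearest (squared Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda y: dist(model[y], row))

# Two "companies" train on the same shared data and end up with the same model.
shared_rows = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]]
labels      = ["churn", "churn", "stay", "stay"]
model_a = train_centroid_classifier(shared_rows, labels)
model_b = train_centroid_classifier(shared_rows, labels)
assert model_a == model_b  # same inputs -> same model

# A third company appends one proprietary feature (say, an internal usage signal
# no competitor can see) to every row before training.
proprietary_signal = [0.0, 0.0, 1.0, 1.0]
extra_rows = [row + [x] for row, x in zip(shared_rows, proprietary_signal)]
model_c = train_centroid_classifier(extra_rows, labels)

# On an ambiguous customer, the shared-data model has no real signal to use,
# but the extra ingredient resolves the call decisively.
customer = [0.5, 0.5]
predict(model_a, customer)          # equidistant from both centroids: a coin flip
predict(model_c, customer + [1.0])  # -> "stay"
```

The two companies with only the shared data are interchangeable; the third produces a different, more informed answer from the same starting point — which is the whole argument in miniature.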
He is not alone in this view. Salesforce CEO Marc Benioff made waves late last year when he declared that LLMs are now a commodity — “commodity infrastructure you hot-swap for whoever’s cheapest and best” — a view that strongly echoes Palihapitiya’s refrigeration analogy. Meanwhile, Marc Andreessen has separately warned that creating LLMs could be a race to the bottom, pointing out that when every player has access to the same data and the same GPUs, pricing power evaporates quickly. Andrej Karpathy has gone further, comparing LLMs to public utilities like electricity — essential infrastructure that everyone depends on but that no single company can easily monetize through differentiation alone.
What Palihapitiya is pointing toward, then, is a coming phase of the AI economy in which the model layer becomes table stakes and the real competition shifts to applications and data. The refrigerator is already in every kitchen. The question is who is going to build the Coca-Cola.