For the last few years, NVIDIA has enjoyed unquestioned dominance in the GPU market, but competitors are slowly beginning to emerge.
Amazon is offering customers its Trainium AI chips at a discount to challenge NVIDIA, The Information reports. AWS reportedly provided a long-term customer with Trainium-powered servers at 25 percent of the price of comparable NVIDIA H100-based hardware, while claiming similar performance.

Amazon’s Trainium is a family of AI chips purpose-built for AI training and inference, designed to deliver high performance while reducing costs. Its latest iteration, Trainium2, is specifically designed for training large language models (LLMs) and foundation models with hundreds of billions to over a trillion parameters, and it can significantly reduce training time for such models. For instance, a cluster of 100,000 Trainium chips can reportedly train a 300-billion-parameter AI model in weeks instead of months.
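The weeks-versus-months claim is easier to place in context with a rough compute estimate. The sketch below uses the common ~6·N·D FLOPs rule of thumb for dense transformer training; the token count and per-chip sustained throughput are illustrative assumptions, not published Trainium figures.

```python
# Back-of-envelope training-time estimate using the ~6 * N * D FLOPs rule of thumb.
# All concrete numbers below are illustrative assumptions, not figures
# from Amazon or The Information.

params = 300e9            # 300-billion-parameter model (from the article)
tokens = 6e12             # assumed training set size: 6 trillion tokens
total_flops = 6 * params * tokens       # ~1.08e25 FLOPs for the full run

chips = 100_000                         # cluster size cited in the article
per_chip_flops = 200e12                 # assumed sustained throughput per chip (200 TFLOP/s)
cluster_flops = chips * per_chip_flops  # ~2e19 FLOP/s in aggregate

seconds = total_flops / cluster_flops
print(f"Estimated training time: {seconds / 86_400:.1f} days")
# ~6 days of pure compute under these assumptions; real runs add data loading,
# checkpointing, and networking overhead, stretching this into weeks.
```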
Amazon’s AWS gives customers the option of choosing which chips they’d like to use for their workloads, including offerings from NVIDIA, Intel, and AMD. Because AWS already has a large base of customers, Amazon can pitch its own chips to them directly and has the leeway to offer substantial discounts. Amazon has also been looking to promote its chips outside AWS, and has invested $110 million to support AI research at universities using Trainium chips.
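In practice, that choice surfaces as little more than an EC2 instance type. Here is a minimal sketch, assuming boto3 and a placeholder AMI ID, of how a customer would select Trainium-backed hardware instead of an H100-backed alternative:

```python
# Minimal sketch: the accelerator behind an AWS training job is picked by
# choosing the EC2 instance type. The AMI ID below is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# "trn1.32xlarge" is a Trainium-backed instance family; an H100-backed
# alternative would be a P5 instance such as "p5.48xlarge".
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder Deep Learning AMI ID
    InstanceType="trn1.32xlarge",      # swap this string to switch accelerators
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```

Because switching accelerators is a one-line change on the customer's side, pricing becomes the main lever Amazon has to pull.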
Amazon isn’t the only player with its own AI training chips. Google has long had TPUs, which offer similar capabilities, but are largely used to power Google’s own infrastructure. And while competition in the GPU space could hit NVIDIA’s dominance, it will likely also lead to a price war, which could make computation cheaper than ever, and help accelerate the already-booming AI revolution.