Nvidia’s near-total dominance of the market for artificial intelligence chips is head-turning — so much so that it’s now set to be the subject of a Department of Justice investigation for potential anticompetitive conduct. The company is estimated to have captured as much as 90 percent of the market for high-end AI chips. Its stock is responsible for a third of the S&P 500’s gains this year. CEO Jensen Huang is being mobbed by fans to the point where the celebrity treatment has a name: “Jensanity.” Its graphics processing units, or GPUs, which have become the go-to chips for AI, are so highly sought after that startups go to extreme lengths to access their computing power. Even the idea of taking some of Nvidia’s premier chips away from a company can spark a scandal. It’s a remarkable illustration of how a single company can shape an entire ecosystem. And the way Nvidia’s competitors — and regulators — are reacting might shake it all up again.

Nvidia’s runaway success comes not only from how good its chips are. It kept itself at the center of the generative AI wave by building a whole network of related hardware and software. Its strategy of bundling essential software with the chips has triggered strong criticism from customers and competitors, who say those sales tactics lock them into Nvidia products.

What does that mean for the future of AI? Will a software-driven industry really be reshaped by the hardware that runs it?

I spoke with Dylan Patel, who runs his own semiconductor research and consulting firm, about why Nvidia is so powerful and how its competitors have been maneuvering to break its lead.

For its actual chips, he explained, Nvidia faces solid head-to-head competition from AMD and other chipmakers, including Qualcomm, Intel, and Apple. These companies are essentially chip designers, and they pretty much all use the same manufacturer to fabricate their designs: Taiwan’s TSMC. And Nvidia doesn’t always have the upper hand with its products: the performance edge between Nvidia’s and AMD’s processors has flipped back and forth across releases.

Instead, Nvidia has distinguished itself with its software library, known as CUDA. It launched in 2007 to make GPU programming easier for those already familiar with C programming (a short sketch of what that code looks like appears below), a move that paid off when deep learning started to boom years later. That early start has been reinforced by strong network effects: today, over 4 million developers worldwide rely on Nvidia’s software platform for building AI and other applications.

“It’s not that they are necessarily having technology that no one else has access to,” said Patel. “It’s just the fact that Nvidia has made the software. Of course, the software is only compatible with their GPUs.”

Still, analysts, including Patel, have been predicting for months that Nvidia’s software lead is not as substantial as it once was. Why? Because other companies — in fact, nearly all of its competitors — have banded together to develop open alternatives that would break Nvidia’s lock on the software and hardware ecosystem facilitating AI. Intel, Google, Arm, Qualcomm, Samsung, and other tech giants are pushing plans for a new software suite that could allow developers’ code to run on any machine with any chip. Similar efforts come from OpenAI, which has released an open-source language so researchers with no CUDA experience can write GPU code, and from the open-source PyTorch Foundation, which was incubated by Meta.
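To make that concrete, here is a minimal sketch of what CUDA code looks like: a vector-addition kernel written in CUDA C, the standard beginner example rather than anything drawn from the article or from Patel. Aside from a handful of keywords and the kernel-launch syntax, it reads much like ordinary C, which is why the platform felt approachable to C programmers; but code written this way compiles only for Nvidia GPUs, which is the lock-in Patel describes.

```cuda
// Minimal illustrative CUDA C example: add two vectors on the GPU.
// A standard "hello world" of GPU programming, shown here for context.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// __global__ marks a function that runs on the GPU; each thread adds one pair.
__global__ void vector_add(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element index
    if (i < n) {
        out[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) buffers.
    float *h_a = (float *)malloc(bytes);
    float *h_b = (float *)malloc(bytes);
    float *h_out = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Allocate device (GPU) buffers and copy the inputs over.
    float *d_a, *d_b, *d_out;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_out, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(d_a, d_b, d_out, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %.1f\n", h_out[0]);    // expect 3.0

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_out);
    free(h_a); free(h_b); free(h_out);
    return 0;
}
```

The open alternatives mentioned above aim to let developers write roughly this kind of parallel code once and run it on chips from any vendor, rather than only on Nvidia’s.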
Other efforts are gunning to replace Nvidia’s proprietary hardware, including the interconnect technology it uses to link multiple AI chips within and across servers. Again, a group of competitors including Intel, Microsoft, Meta, AMD, and Broadcom are working on a new industry standard for that linking technology, which is crucial for modern data centers — and which, if successful, would give them sway over the hardware that tech companies have to use.

The collision between a proprietary and an open approach reminds many observers of how the smartphone market developed — which created winners for both visions in Apple (closed) and Google’s Android (open). While Nvidia’s Huang has said its strategy is “really not one or the other,” Patel said the ongoing debate could be a talking point to head off antitrust scrutiny. It’s not the only open-vs.-closed debate in the AI world, either: a tug-of-war between “open-source” and proprietary versions of AI models themselves is dividing the industry, spurring regulatory conversations and even fueling national rivalries. As for how these arguments will ultimately shape who builds AI, and how it works — even an Nvidia chip might be hard-pressed to guess.