Nvidia is going on the defensive as the list of contenders grows

There are signs that Nvidia is adopting a defensive posture as competitors try to carve out a piece of the AI computing market it has dominated thus far.

Wall Street is wondering when competitors will make a dent. Amazon is trying with little success so far. Microsoft, Meta, and Google all have AI chips in the works, if not in use. But none are as widely used as Nvidia’s graphics processing units and accompanying software.

A VC investor told us that even trying to compete with Nvidia would be “a very stupid investment.” Still, a slew of startups are wading in. Most are going after a small part of Nvidia’s CUDA software platform, or tweaking the Nvidia baseline to match particular types of computing workloads. Large language model customers might want more memory, for example.

Competitors are also more likely to be working on AI inference, answering queries or drawing conclusions based on previous training and monitoring the results, rather than on the more expensive and intensive training stage of developing AI tools, at which Nvidia reportedly excels.

Morgan Stanley analysts wrote Monday that Nvidia is particularly concerned about Advanced Micro Devices (AMD), its Santa Clara, California, neighbor, led by CEO Lisa Su, a cousin of Nvidia CEO Jensen Huang.

“Our checks would indicate that Nvidia is clearly focused on defending market share, accelerating their roadmap cadence and becoming more aggressive with pricing. We think that AMD is viewed as particularly important as a threat,” the analysts wrote.

As the supply of Nvidia’s Hopper chips increases, competitors advance, and power supply becomes a near-term limit on GPU usage and overall AI computing power, Nvidia faces new market friction, according to Morgan Stanley.

Nvidia declined to comment on the Morgan Stanley analysis.

The revenue comparison between Nvidia and AMD is still sobering. Nvidia generated $26 billion in revenue in the first quarter of 2024 compared to AMD’s $5.5 billion. But AMD’s ability to work in many clouds marks it as a competitive concern.

AMD, like Nvidia, is cloud-agnostic, whereas many of the chips in development at cloud providers such as Amazon and Microsoft are compatible only with those companies’ in-house cloud services.

What does going on the defensive look like for a company with between 70% and 90% market share? It could mean becoming more price-conscious. With gross margins above 70%, Nvidia has room to lower prices, though its costs may soon increase, as its largest supplier, Taiwan Semiconductor Manufacturing Co. (TSMC), has indicated it may raise its prices.

The promised yearly cadence of new product launches is another defensive tactic Nvidia is employing.

“Our sense is that investor expectations still haven’t adjusted enough to the impacts the Blackwell launch will have on Nvidia competitors,” the analysts wrote.

Blackwell is Nvidia’s next chip, now in production and shipping later this year, according to Huang. The fast pace of new launches makes meaningful competition even harder to fathom, according to Morgan Stanley.

Other players to watch

AMD may be the company Nvidia is keeping the closest eye on, but the market’s AI computing needs and possibilities go beyond GPUs.

Though not an apples-to-apples competitor, Broadcom is also a player receiving attention for its AI-enabling products. Broadcom doesn’t supply the coveted GPUs, but it does supply the networking components that make high-volume, high-speed movement of data between processors possible, which in turn enables the massive amount of parallel computing needed for AI.

This capability is exactly why Nvidia acquired Mellanox in 2019, a crucial move to realize the AI data center at the heart of Nvidia’s rise.

Morgan Stanley assessed that Broadcom is poised to gain market share while AMD’s fate is less certain.

AMD did not respond to a request for comment in time for publication.
