Nvidia's Underrated Business Soars Like a Rocket Ship

The Hidden Power Behind Nvidia’s Data Center Dominance
When Nvidia (NVDA) releases its second-quarter earnings on August 27, investors will be closely watching the company’s data center segment. This division is where the chipmaker generates the bulk of its revenue, particularly through the sale of high-performance AI processors. However, the data center segment encompasses more than just chip sales—it also includes a critical but often underappreciated component: networking technologies.
Nvidia's networking solutions, which include NVLink, InfiniBand, and Ethernet, play a vital role in enabling communication between its chips, allowing servers to interact within massive data centers, and ensuring end users can access AI applications seamlessly. These technologies are essential for building powerful computing infrastructures that support advanced AI workloads.
“Building a supercomputer requires strong infrastructure,” said Gilad Shainer, senior vice president of networking at Nvidia. “The key is how you connect those computing engines together to form a larger unit of computing.”
In the previous fiscal year, networking sales accounted for $12.9 billion of Nvidia’s $115.1 billion in data center revenue. While that may seem modest next to the $102.1 billion generated from chip sales, it surpasses the $11.3 billion earned by the company’s second-largest segment, gaming. In the first quarter, networking contributed $4.9 billion of the $39.1 billion in data center revenue, and its growth is expected to continue as companies expand their AI capabilities.
Networking: The Unsung Hero of AI Infrastructure
Deepwater Asset Management managing partner Gene Munster described Nvidia’s networking business as “the most underappreciated part of the company’s operations.” He noted that while networking accounts for only 11% of Nvidia’s revenue, it is growing rapidly and is crucial for supporting the company’s core AI offerings.
Nvidia’s networking solutions span three primary types of networks. First is NVLink, which connects GPUs within a server or across multiple servers housed in a tall, cabinet-like rack. Next is InfiniBand, which links multiple server nodes across a data center to form a large-scale AI computer. Finally, there is the front-end network for storage and system management, which relies on Ethernet connectivity.
These networks are not just about connecting hardware—they’re designed to enable fast, efficient communication between components. If data transfer between GPUs is slow, it can hinder overall performance and reduce the efficiency of an entire data center.
“Without networking, Nvidia would be a very different business,” Munster said. “The value that customers seek from Nvidia’s chips wouldn’t exist without its networking infrastructure.”
As AI models become more complex and autonomous systems take shape, ensuring that GPUs work in harmony becomes increasingly important. This is especially true when it comes to inferencing—the process of running AI models—which demands powerful data center systems.
The Rise of Inferencing in AI
The AI industry is currently undergoing a shift in focus toward inferencing. Initially, many believed that training AI models required massive computational power, while running them was less demanding. However, this perception has changed as companies have realized that even running AI models benefits from high-performance systems.
Earlier this year, DeepSeek claimed it trained its AI models using lower-end Nvidia chips, sparking concerns that high-powered systems might not be necessary. But as chipmakers have pointed out, those same models perform better when run on powerful systems, processing more information faster.
“I think there's still a misperception that inferencing is trivial and easy,” said Kevin Deierling, senior vice president of networking at Nvidia. “It turns out that as we move toward agentic workflows, inferencing is starting to look more like training. All of these networks are important, and having them tightly integrated with CPUs, GPUs, and DPUs is essential for a good user experience.”
Despite this, competition is intensifying. AMD is aiming to capture more market share, while cloud giants like Amazon, Google, and Microsoft are developing their own AI chips. Industry groups are also introducing competing networking technologies, such as UALink, which aims to rival NVLink.
However, Nvidia continues to lead the market. As tech giants, researchers, and enterprises compete for access to its chips, the company’s networking business is poised for continued growth.