NVIDIA Corporation (NVDA): A Bull Case Theory

We came across a bullish thesis on NVIDIA Corporation (NVDA) on UncoverAlpha’s Substack by Rihard Jarc. In this article, we will summarize the bulls’ thesis on NVDA. NVIDIA Corporation (NVDA)’s shares were trading at $136.92 as of November 26th. NVDA’s trailing and forward P/E ratios were 54.05 and 32.57, respectively, according to Yahoo Finance.


Nvidia’s recent earnings report highlights the company’s remarkable growth, with total revenue of $35.1 billion, up 17% sequentially and 94% year-over-year. This performance surpassed the $32.5 billion forecast, driven by robust growth across all segments, particularly the Data Center division, which saw a staggering 112% year-over-year increase, reaching $30.8 billion in revenue. The surge was primarily fueled by Nvidia’s Hopper-generation chips, with H200 sales alone contributing billions. Net income followed suit, rising 109% to $19.3 billion, further solidifying Nvidia’s financial dominance. Despite these impressive results, concerns linger over Nvidia’s dependence on Cloud Service Providers (CSPs) such as Amazon, Microsoft, Google, and Oracle. Half of the company’s Data Center sales were tied to CSPs, a figure that has remained stable across recent quarters. This heavy reliance on CSPs raises questions about the sustainability of Nvidia’s growth, particularly as these providers approach capital expenditure (CapEx) limits that could constrain future expansion.

While Nvidia’s Data Center segment continues to perform well, especially as the company expands into sovereign AI and consumer internet customers, its increasing dependence on CSPs presents challenges. These CSPs are nearing their CapEx limits, raising concerns about their ability to keep ramping up spending on AI infrastructure. Moreover, with CSPs’ Free Cash Flow (FCF) now tracking closely with their CapEx, balancing AI workload growth against investors’ return expectations will become a tightrope for these companies to walk. On a slightly negative note, Nvidia reported a sequential dip in networking revenue, despite a strong push to offer a comprehensive full-stack solution that combines GPUs, CPUs, and networking for supercomputers. Although networking revenue grew 20% year-over-year, the sequential dip raised doubts about the segment’s growth trajectory. Nvidia addressed this by suggesting that networking revenue will likely return to sequential growth next quarter as newer systems, including Blackwell chips, begin to ramp up. Still, the uncertainty adds an element of caution to Nvidia’s broader performance outlook.

Another area of concern is Nvidia’s gross margin, which dipped slightly from the previous quarter to 74.6%, a decline attributed to the shift toward more complex, higher-cost systems. Nvidia anticipates that margins will continue to moderate into the low 70s as Blackwell chips are fully deployed by the second half of 2025. This moderation raises questions about the company’s ability to maintain its premium pricing, especially as older chips like the H100, once highly sought after, experience rapid depreciation. As newer, more cost-efficient chips such as the H200 and Blackwell come to market, Nvidia will need to navigate increasing competition and manage customer expectations around pricing and performance.

An interesting development in Nvidia’s strategy is the shift from focusing solely on pre-training for Large Language Models (LLMs) to a broader emphasis on post-training and inference scaling. This change suggests that the incremental performance gains in LLMs from pre-training may be diminishing, prompting Nvidia to explore new methods to scale AI models. This shift could present new challenges, especially in the use of synthetic data, which some experts question as a sufficient driver of further AI advancements. Despite these hurdles, Nvidia’s innovations in industrial AI, particularly robotics, are gaining momentum. The company’s NeMo and Omniverse platforms are already enabling large manufacturers to integrate AI into industrial robotics. CEO Jensen Huang noted that the “age of robotics is coming,” signaling a significant growth opportunity for Nvidia in this rapidly expanding market. Additionally, Nvidia is advancing multimodal AI, focusing on processing video, image, and sensory data, which could unlock new potential for AI applications.

Looking to the future, Nvidia’s addressable market is vast, with the transition from traditional CPU-based computing to AI-driven machine learning workloads poised to accelerate. The company’s GPUs will be critical in powering the next generation of data centers as they evolve to support AI models and training. By 2030, the global data center market is expected to reach several trillion dollars, with machine learning and AI driving much of the growth. Nvidia’s role in this transformation is central, and its leadership in GPUs positions the company to capture a significant share of this expanding market. At a projected 59% annual growth rate, Nvidia could generate a cumulative $1.65 trillion in data center revenue over the next four years. By 2028, the company’s total revenue could reach $600 billion, with net income hitting $300 billion, assuming it maintains its current net margins. However, the shift toward post-training and inference scaling introduces uncertainties, particularly in the price-sensitive inference market. Despite these challenges, Nvidia’s strong financial position and strategic focus on AI, robotics, and multimodal capabilities suggest that the company is well-equipped to maintain its leadership in AI infrastructure.
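For readers who want to sanity-check the projection above, here is a minimal back-of-the-envelope sketch in Python. It is not the thesis author’s model: it assumes a starting base of roughly $123 billion (the $30.8 billion data center quarter annualized) and the ~55% net margin implied by the quarter’s results, then compounds at the 59% rate cited in the thesis. The thesis’s own $1.65 trillion and $600 billion figures imply a somewhat lower starting base, so treat the output purely as an order-of-magnitude check.

```python
# Back-of-the-envelope check of the growth projection discussed above.
# The starting base and net margin are illustrative assumptions derived from
# the latest reported quarter; only the 59% growth rate comes from the thesis.

def project_revenue(base: float, growth: float, years: int) -> list[float]:
    """Return projected annual revenue (same units as `base`) for each future year."""
    projections = []
    revenue = base
    for _ in range(years):
        revenue *= 1 + growth
        projections.append(revenue)
    return projections

# Assumption: annualize the $30.8B data-center quarter (~$123B per year).
base_annual_dc_revenue = 30.8 * 4      # $ billions
growth_rate = 0.59                     # 59% annual growth, per the thesis
net_margin = 19.3 / 35.1               # ~55%, from the quarter's net income / revenue

yearly = project_revenue(base_annual_dc_revenue, growth_rate, years=4)
cumulative = sum(yearly)

print(f"Year-by-year data-center revenue ($B): {[round(x) for x in yearly]}")
print(f"Cumulative over four years: ~${cumulative / 1000:.2f}T")
print(f"Final-year net income at current margin: ~${yearly[-1] * net_margin:.0f}B")
```

Running the sketch lands in the same ballpark as the thesis (a cumulative figure in the high-single-digit trillions of dollars would signal an error; here it comes out near $1.8 trillion), which illustrates how sensitive the headline numbers are to the assumed starting base and growth rate.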

In summary, Nvidia’s earnings report underscores its strong financial performance and leadership in AI and data centers, but it also highlights growing concerns about its dependence on CSPs, the potential slowdown in pre-training for LLMs, and competitive pressures. While Nvidia is well-positioned for long-term growth, particularly with its expansion into industrial AI and multimodal capabilities, the company will need to navigate challenges in maintaining high margins and managing its pricing strategy in the evolving market for AI chips. The company’s valuation, currently at $3.6 trillion, reflects high expectations, and future growth will depend on Nvidia’s ability to sustain its momentum in a rapidly changing competitive landscape.

NVIDIA Corporation (NVDA) is on our list of the 31 Most Popular Stocks Among Hedge Funds. As per our database, 193 hedge fund portfolios held NVDA at the end of the third quarter, up from 179 in the previous quarter. While we acknowledge the risk and potential of NVDA as an investment, our conviction lies in the belief that some AI stocks hold greater promise for delivering higher returns, and doing so within a shorter timeframe. If you are looking for an AI stock that is more promising than NVDA but that trades at less than 5 times its earnings, check out our report about the cheapest AI stock.

READ NEXT: 8 Best Wide Moat Stocks to Buy Now and 30 Most Important AI Stocks According to BlackRock.

Disclosure: None. This article was originally published at Insider Monkey.