10 Stock News You Should Pay Attention To

3. Nvidia Corp (NASDAQ:NVDA)

Number of Hedge Fund Investors: 235

Dan Niles of Niles Investment Management reiterated his concerns about a potential slowdown in AI demand during a recent appearance on CNBC. This time, Niles pointed to Nvidia’s data center revenue miss in its latest quarterly report.

“Over the weekend it came out that OpenAI has sort of pushed up their revenue forecast for 2030 by about $26 billion to $200 billion,” Niles said. “But to get that revenue increase of $26 billion, they’re going to have to burn $85 billion more in cash. So the data points are definitely there, but you know, we’re in this business to make money. So you have to go, all right, well, do the fundamentals matter right now, or is it the fact you’re going to get even more easy money? The 10-year yield’s going down to 4%. And so this market, much like in 2021, will keep going up right up until it doesn’t. And so I think you have to look at that. And Nvidia missing on data center revenues, and you can say, yes, the growth was so high and all this, but they missed, so that should give you another data point. And you throw on top of that Dell or Marvell or the fact that Broadcom’s got more ASIC revenues because companies want to cut their costs—those all speak to the same narrative, so don’t confuse the actual data with what the stock prices are doing.”

Nvidia’s Hopper and now Blackwell GPU architectures form the core of AI infrastructure for LLM training and inference. But Nvidia’s growth is slowing compared with previous quarters amid rising competition and capex limits at major customers. In the recently reported quarter, Nvidia’s year-over-year revenue growth came in at 56%, compared with the nearly 100% growth rates it posted in earlier quarters.

With its strong position in the data center market and rising demand, Nvidia is likely to keep growing, though not at the pace it has in the past. Increasing competition from major players like Broadcom is also expected to weigh on Nvidia’s margins over the long term.

Loomis Sayles Growth Fund stated the following regarding NVIDIA Corporation (NASDAQ:NVDA) in its second quarter 2025 investor letter:

“NVIDIA Corporation (NASDAQ:NVDA) is the world leader in artificial intelligence (AI) computing, which enables computers to mimic human-like intelligence for problem-solving and decision-making capabilities. Founded in 1993 to develop faster and more realistic graphics for PC-based video games, Nvidia created the first graphics processing unit (GPU), a dedicated semiconductor that employs a proprietary parallel processing architecture to perform superior graphics rendering outside of a computer’s standard central processing unit (CPU). The parallel processing capability of Nvidia’s GPUs, which contrasts with the linear processing requirement of CPUs, can accelerate computing functions performed by standard CPUs by greater than ten times. As a result, Nvidia extended its visual computing expertise beyond its legacy gaming market into innovative new and larger markets, including data centers, autos, and professional visualization. The parallel processing capability facilitates pattern recognition and machine learning functions that have enabled Nvidia to be at the forefront of growth in artificial intelligence applications. As a result, the data center business, which first surpassed the gaming business to become Nvidia’s largest revenue and profit generator in its 2023 fiscal year, grew to represent over 88% of revenue in the company’s most recent fiscal year. The company is also focused on building out its GPU-computing-based ecosystem and is helping to enable breakthroughs in autonomous driving and virtual reality.

A fund holding since the first quarter of 2019, Nvidia reported very strong quarterly financial results that reflected the company’s dominance in capturing spending on AI computing within data centers. For the quarter, total revenue of $44.1 billion grew 69% year over year and 12% versus the prior quarter, despite new U.S. Government restrictions on the sale of its H20 chips to China that resulted in $2.5 billion of foregone revenues in the period. Nvidia’s H20 chips were specifically designed to comply with prior U.S. export restrictions, and the company anticipates a further $8 billion of foregone sales in the current quarter due to the restrictions. Despite the revenue headwind, the company expects revenue of approximately $45 billion in the current quarter, which would represent 50% growth over the prior-year quarter. The results were also notable due to recent concerns that spending might slow given potentially cheaper options to develop AI functionality. These concerns were catalyzed by the January 2025 launch of DeepSeek-V3, a chatbot that appears to rival OpenAI’s ChatGPT from the standpoint of industry performance metrics, but which was claimed to have been created for a fraction of the cost using Nvidia’s now-restricted H800 chips. We did not believe that the DeepSeek development materially changed the level of investment needed to develop the next generation of frontier models as companies strive for AGI (artificial general intelligence) and beyond. We believe this view is supported by the unchanged plans for AI investment by the industry’s leading spenders. Following the news, some of the world’s largest investors in AI technology, including Meta, Microsoft, and Alphabet, reaffirmed and expanded on their intention to spend tens of billions of dollars in 2025. We believe this supports our thesis that Nvidia’s accelerated computing technology remains crucial to achieving AGI and other AI advances. Further, Nvidia noted that the success of DeepSeek, which employs reasoning AI, has itself been a driver of strong demand. With reasoning AI, as opposed to providing a “one-shot” answer based on statistical probabilities and existing patterns, the model spends more time refining the answer by running it through the model multiple times before outputting an answer that is more accurate and nuanced. As a result, reasoning AI is more compute intensive and can require 100 times more computing power per task than one-shot inferencing. With continued evidence that greater capabilities can be achieved with greater computing power and expanding use cases such as agentic AI, we believe both near-term and long-term demand will remain strong…” (Click here to read the full text)