Investing

3 NVIDIA Quotes from May 22 That Investors Can't Ignore

NVIDIA Stock Chart
Shutterstock / Piotr Swat

NVIDIA’s (NASDAQ: NVDA) earnings call on May 22 was a significant event for the investing world, showcasing incredible growth and strong future prospects. Yet, while the headlines were strong sales and profits last quarter and upbeat guidance for the quarter ahead, it pays to study the areas NVIDIA’s management chose to focus on. In the discussion below, 24/7 Wall Street analysts Eric Bleeker and Austin Smith break down three quotes from NVIDIA’s earnings call that could shape where the stock is headed in the years to come.

The 3 Most Important Quotes from NVIDIA’s Earnings Call

If you closely followed NVIDIA’s earnings last week, you likely saw that the company reported revenue of $26 billion, which exceeded Wall Street estimates of $25.6 billion. The beat was driven by NVIDIA’s Data Center segment posting $22.6 billion in sales, an incredible 427% increase from a year ago.

Yet whether NVIDIA’s stock continues to beat the market in the year ahead depends on a few key areas. Here are the three quotes from NVIDIA’s conference call that Eric Bleeker identified as the most important for investors to focus on. Below, you’ll find each quote followed by our analysis of why it’s so important.

“Training and inferencing AI on NVIDIA CUDA is driving meaningful acceleration in cloud rental revenue growth, delivering an immediate and strong return on cloud providers’ investment. For every $1 spent on NVIDIA AI infrastructure, cloud providers have an opportunity to earn $5 in GPU instant hosting revenue over 4 years.”

This quote is important because one of the strongest “bear” arguments against NVIDIA is its dependence on hyperscalers like Microsoft, Google, and Amazon, which are buying up GPUs for cloud computing. Bears worry that if these companies start seeing poor returns from all the GPUs they’re buying, they’ll quickly pull the rug on spending, NVIDIA’s sales growth will plummet, and the stock will crash.

NVIDIA put this quote near the beginning of its earnings call for a reason. It demonstrates that hyperscalers can get a phenomenal return on their investment. If they’re getting even close to a 5:1 return on GPU purchases, there is little reason to curtail spending any time soon.
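To make the math behind that claim concrete, here is a minimal back-of-the-envelope sketch in Python. Only the 5:1 ratio and the four-year window come from NVIDIA’s comments; the $10 billion capex figure is a purely hypothetical input for illustration.

# Back-of-the-envelope check on the 5:1 claim from the call.
# Only the ratio and the four-year window come from NVIDIA remarks;
# the capex figure is a hypothetical illustration.
gpu_capex = 10e9         # hypothetical hyperscaler spend on NVIDIA AI infrastructure, in dollars
revenue_multiple = 5     # $5 of hosting revenue per $1 spent, per the call
years = 4

total_hosting_revenue = gpu_capex * revenue_multiple
average_per_year = total_hosting_revenue / years

print(f"Implied 4-year hosting revenue: ${total_hosting_revenue / 1e9:.0f} billion")
print(f"Implied average per year: ${average_per_year / 1e9:.1f} billion")

Under those assumed inputs, $10 billion of GPU spending would imply roughly $50 billion of hosting revenue over four years, which is why even a ratio well short of 5:1 would still justify continued buying.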

“In our trailing 4 quarters, we estimate that inference drove about 40% of our Data Center revenue. Both training and inference are growing significantly.”

This was another important quote from NVIDIA’s conference call. So far, AI sales from NVIDIA have mostly been for “training” AI models, a market where NVIDIA has stronger advantages. However, as more AI workloads move from training to “inferencing” (which means actually running the models), there have been doubts about whether NVIDIA can maintain its dominant market share.

NVIDIA will likely continue updating the percentage of revenue it gets from inferencing. If that ratio keeps rising from 40%, it would be a very important “bullish” signal for the stock as more revenue shifts toward inferencing workloads.

“Demand for Hopper during the quarter continues to increase. Thanks to CUDA algorithm innovations, we’ve been able to accelerate LLM inference on H100 by up to 3x, which can translate to a 3x cost reduction for serving popular models like Llama 3.”

NVIDIA wants customers to buy its integrated solutions. While the media focuses on NVIDIA’s chip sales, the truth is the company wants to sell entire systems. As one example, while its “compute” revenue grew fivefold last quarter, NVIDIA’s networking revenue wasn’t far behind: it grew threefold.

Advantages like CUDA (NVIDIA’s software layer for GPUs) help the company build integrated solutions that keep customers on NVIDIA hardware. The company is pushing hard for customers to buy its GB200 rack-scale system, which costs up to $3 million per unit. Reports out of Taiwan suggest NVIDIA could be preparing to sell up to 40,000 GB200s. If NVIDIA were to hit those sales totals, it would likely exceed Wall Street expectations, and the stock would continue delivering in the year ahead.
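For anyone checking the upside math referenced in the transcript below, here is a minimal sketch using only the two upper-bound figures cited above; since both inputs are ceilings, the result should be read as a best case rather than a forecast.

# Upper-bound GB200 revenue implied by the figures cited in this article.
units = 40_000             # reported potential GB200 production (upper bound)
price_per_unit = 3e6       # up to $3 million per rack-scale system (upper bound)

potential_revenue = units * price_per_unit
print(f"Implied GB200 revenue ceiling: ${potential_revenue / 1e9:.0f} billion")  # roughly $120 billion

That ceiling of roughly $120 billion is consistent with the “over $100 billion” figure mentioned in the transcript.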

Transcript:

Eric, there’s Black Monday and now the inverse is May 22nd.

NVIDIA’s incredible earnings absolutely lit the investing world on fire.

I mean, just absolutely eye-popping numbers and statements that you don’t typically see from companies that have already surged tens of thousands of percent over the last few years and sit at a market cap in the trillions.

So looking at May 22nd, the most important earnings call I can recall in recent memory, what are the three most important things that NVIDIA said?

Yeah, you know, if we look at NVIDIA’s earnings, as you said, they were up 9.3% the next day.

It was a triumph once again, after the previous quarter where their stock ran right after earnings, and it was because revenue came in at $26 billion.

That’s a headline.

Wall Street was expecting $24.5 billion or so, and the beat was driven by the data center segment.

Austin, I know you sent this number to me and said, is this a typo?

It’s not a typo.

It grew 427% from last year.

Unfathomable to see a company NVIDIA’s size growing a unit that much.

So I went through NVIDIA’s conference call.

I picked out three of the most important quotes, because you’re always looking at what management is trying to highlight and what that says about the future of the company.

So the first quote is from NVIDIA’s CFO, and it says: training and inferencing AI on NVIDIA CUDA is driving meaningful acceleration in cloud rental revenue growth, delivering an immediate and strong return on cloud providers’ investment.

For every $1 spent on NVIDIA AI infrastructure, cloud providers have an opportunity to earn $5 in GPU instant hosting revenue over four years.

Why does that quote matter?

Well, it’s the $5 for every dollar spent, because one of the biggest bear arguments against NVIDIA is that they’re dependent on a handful of customers.

I’m talking about large hyperscalers like Microsoft, Amazon, Google for their revenue.

And the argument goes, if these companies start seeing poor returns from all the GPUs they’re buying, they’ll pull back on spending.

Well, that would cause NVIDIA’s revenue growth to pull back, and its stock price would crash.

The quote coming near the beginning of NVIDIA’s earnings call sends a clear signal that hyperscalers are getting a great return on their investment. Not just two to one, not just three to one; putting out a five to one shows that GPU buying doesn’t look poised to slow down for the foreseeable future.

So here’s the second quote I want to go over.

In our trailing four quarters, we estimate that inference drove about 40% of our data center revenue.

Both training and inference are growing significantly.

So why is this so important?

Well, one of the key questions about NVIDIA is, as a lot of data center and AI chip sales revenue moves from training to inferencing, which is basically running AI models, will NVIDIA be in as strong of a competitive position?

And I think them saying that they’re already getting 40% at this point, that’s a number that’s very impressive.

And it’s NVIDIA saying to the market that they’re going to continue to hold a relatively dominant position, even as this transition happens.

And third, this quote says, demand for Hopper during the quarter continues to increase.

Thanks to CUDA algorithm innovations, we’ve been able to accelerate LLM inference on H100 by up to 3x, which can translate to a 3x cost reduction for serving popular models like Llama 3.

Now, a little bit of an AI word salad there, but hey, here’s the key point that you need to understand.

NVIDIA wants to be integrated.

They don’t just want to sell chips.

They want to sell entire solutions.

They want to sell networking, and while their GPU revenue grew fivefold, which is mighty impressive, their networking revenue also grew threefold last quarter.

So CUDA, which is their software, kind of a vendor lock-in you’d call it, is just part of the integrated solution.

I think one of the biggest surprises this next generation appears to be NVIDIA’s GB200 system, which I think has up to 72 GPUs and NVIDIA networking.

It’s just a full solution.

It goes for up to $3 million, and reports out of Taiwan say they could be making up to 40,000 of these in the next generation.

I think if you’re looking for where NVIDIA is going to surprise on the upside in the next year, it’s because of their success selling these integrated solutions, and they really want to hammer that.

And look, for anybody doing the quick math at home, that is over $100 billion in revenue just from that product line, which is a shockingly large number to say.

Staggering numbers, Austin.

That’s correct.
