2 AI Stocks That Could See Faster Growth Than NVIDIA
No stock in the S&P 500 has seen recent growth at levels approaching NVIDIA (Nasdaq: NVDA). Last quarter the company saw sales growth of 265% while profits soared an absurd 769%.
Yet, a recent estimate for growth across different AI technology markets caught my attention. While the processors NVIDIA makes have been the hottest growth market in AI, another market could see even higher growth in the years ahead.
The estimate for AI market growth that caught my attention came from Spear Invest. I’ve attached the data below in a table.
| Market | 2023 Sales (Billions) | 2027 Sales (Billions) | Compound Annual Growth Rate (CAGR) |
| --- | --- | --- | --- |
| Power Management | $12 | $25 | 20% |
| Processors | $110 | $400 | 38% |
| Thermal Management | $5 | $20 | 40% |
| Networking | $25 | $100 | 41% |
| HBM | $3 | $30 | 78% |
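As a sanity check, the table's growth rates can be recomputed from the 2023 and 2027 sales estimates with the standard CAGR formula, (end ÷ start)^(1/years) − 1. Here's a minimal Python sketch; the market names and dollar figures come straight from the table, and the four-year horizon is 2023 to 2027:

```python
# Recompute each market's CAGR from the 2023 and 2027 estimates above.
# Figures are in billions of dollars, taken from the Spear Invest table.
markets = {
    "Power Management": (12, 25),
    "Processors": (110, 400),
    "Thermal Management": (5, 20),
    "Networking": (25, 100),
    "HBM": (3, 30),
}

def cagr(start, end, years=4):
    """Compound annual growth rate from `start` to `end` over `years`."""
    return (end / start) ** (1 / years) - 1

for name, (sales_2023, sales_2027) in markets.items():
    print(f"{name}: {cagr(sales_2023, sales_2027):.0%}")
```

The recomputed figures land within a point of the table (thermal management works out to roughly 41% here versus the table's 40%, a rounding difference), and HBM's tenfold jump from $3 billion to $30 billion does indeed compound to 78% per year.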
The growth in all of these markets is eye-popping, reflecting a projection for broad expansion across the AI space. And to be sure, processors remain the largest market by far, which would leave NVIDIA in an ideal position.
However, the fastest projected growth rate belongs to HBM, or high bandwidth memory. Its growth rate even surpasses thermal management, which is a space investors have been flocking to via stocks like Vertiv (NYSE: VRT).
HBM is projected to grow at a 78% compound annual growth rate. That's partly because it starts from a low base – just a $3 billion market in 2023 – but tenfold growth through 2027 is enough to turn heads.
The future of AI data centers hinges on two priorities: reducing power consumption and increasing how fast processors can communicate.
HBM addresses both these issues. Traditional RAM used by graphics processors (Graphics Double Data Rate, or GDDR) uses too much power and lags behind the performance gains of the advanced graphics processors used in AI workloads.
HBM helps “solve” the GDDR bottleneck by stacking memory chips with ultra-fast interconnects. You can see a comparison of this new type of memory in this graphic prepared by AMD:
Pay particular attention to the lower voltage, higher bandwidth, and wider bus width, which all show substantial gains. While the voltage gain may not look significant on its own, on a per-bandwidth basis it’s a 300% improvement!
As noted earlier, this is tremendously important as the future of AI data centers requires faster communication and less power.
If you’re looking for HBM investments, I’d recommend starting your search with Micron (Nasdaq: MU) and SK Hynix.
On its recent earnings call, Micron had the following to say about HBM (emphasis mine):
“HBM is a very exciting product. We have an industry-leading product here, best performance specs, 30% lower power, a lot of customer demand, more customer demand than we know how to meet.
And you have heard already in our prepared remarks and in the following Q&A that 2024 is sold out, 2025, overwhelming majority already allocated. So we are going to obviously continue to be very focused on disciplined investments and CapEx, whether it’s for HBM, whether it’s for DRAM, whether it’s for NAND to ensure that our target of sometime in ’25 getting HBM share to be equal to DRAM share.”
The two big takeaways here are that Micron has already sold out its HBM capacity for 2024 and is almost entirely sold out for 2025 as well.
In addition, investing in a growth product like HBM can be tricky when it’s replacing a prior product from the same company. From that perspective, Micron’s target of HBM market share equal to its DRAM share sometime in 2025 is significant.
Micron has been a popular “AI play,” but its financial results hadn’t really been showing impressive AI growth until last quarter when everything changed in a hurry.
In the three months ending November 30th of last year, Micron’s sales grew 15.7% over the prior year. Last quarter, sales growth soared to 57.7%!
As you can imagine, that growth is largely coming from booming artificial intelligence demand. 2025 revenue estimates for Micron now sit at $34.4 billion, which is a 41% jump from 2024. GAAP net income in 2025 is forecasted to hit $8.2 billion, which would put Micron at 16X 2025 earnings.
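The valuation math above can be checked with a few lines of arithmetic. The sketch below uses only figures cited in the text; Micron's actual market capitalization isn't stated in the article, so the market-cap figure is simply what a 16X multiple on the $8.2 billion forecast implies:

```python
# Back out the figures implied by the estimates cited above.
# All inputs are in billions of dollars and come from the article's text.
revenue_2025 = 34.4    # 2025 consensus revenue estimate
growth_vs_2024 = 0.41  # "a 41% jump from 2024"
net_income_2025 = 8.2  # 2025 GAAP net income forecast
forward_pe = 16        # "16X 2025 earnings"

# Implied 2024 revenue: divide the 2025 estimate back out by the growth rate.
implied_revenue_2024 = revenue_2025 / (1 + growth_vs_2024)
# Implied market cap: the multiple times the forecast earnings.
implied_market_cap = forward_pe * net_income_2025

print(f"Implied 2024 revenue: ${implied_revenue_2024:.1f}B")
print(f"Implied market cap at 16X: ${implied_market_cap:.0f}B")
```

In other words, the estimates imply roughly $24.4 billion in 2024 revenue and a market capitalization near $131 billion at the time these multiples were quoted.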
That level might look dirt cheap for a company at the forefront of an AI trend growing at a CAGR of 78%.
Yet, long-time semiconductor investors will tell you that Micron has long been a cyclical stock. It has burned many investors who invested at its peaks right before earnings rapidly fell.
The big question today is whether AI is a “supercycle” that will break semiconductors out of their traditional cycles. If AI demand runs hot for years to come as companies invest heavily to keep up, and the target of AGI within the next 5 years is within sight, both Micron and SK Hynix will almost certainly reward investors from today’s prices.
However, if there’s a pause in AI spending – if hyperscalers like Amazon, Microsoft, and Google wait for more consumer and business adoption of AI services – then Micron and SK Hynix could see significant downside as another semiconductor cycle plays out.
If you’re weighing Micron against SK Hynix, it’s worth noting that SK Hynix is a Korean company. It trades at a discount to Micron when looking at projected earnings – just 8X net income estimates!
However, SK Hynix trades primarily on the Korean market. So if you’re looking to buy it you’ll likely need an account that allows for global trading like a Schwab Global Account. Micron can be purchased in just about any brokerage account.