Investing

Is Micron Technology The Future of Artificial Intelligence?


Talk about artificial intelligence and you immediately think about Nvidia (NASDAQ:NVDA), Broadcom (NASDAQ:AVGO), or even Palantir Technologies (NASDAQ:PLTR). But what if you have it all wrong? What if the future of AI is really Micron Technology (NYSE:MU)? That’s the possibility worth considering, and it was Nvidia CEO Jensen Huang who highlighted where the technology is moving.

24/7 Wall St. Key Points:

  • Nvidia (NVDA) CEO Jensen Huang’s comments on where AI is heading show just how critical Micron Technology (MU) will be to the technology’s advancement.
  • As AI gets pushed closer to the edge, the need for greater memory and storage on PCs and smartphones will drive a critical increase in demand for MU’s chips.

Moving AI closer to the user

As AI gets pushed closer to the edge, Micron Technology’s memory and storage solutions will be in high demand

On the surface, Micron is a counterintuitive choice. Most AI discussion centers on accelerators like Nvidia’s upcoming Blackwell chips, large language models (LLMs) such as Meta Platforms‘ (NASDAQ:META) Llama models, or how companies like Palantir deploy the technology to crunch data. Next to those, Micron’s memory and data storage chips, though essential, can seem almost anachronistic.

Yet as AI gets pushed closer to the edge, into automobiles or onto PCs and smartphones, the amount of random access memory (RAM) available to run the models becomes a bottleneck. It is a delicate balance between efficiency and performance. Running Apple Intelligence on your iPhone is going to be a much slower experience than querying an LLM running in a data center.

However, if RAM can be increased on edge devices and endpoint use cases like cars, we will be able to experience the full power of the technology, and that’s where Micron comes in.

The need for speed

Micron Technology’s high-bandwidth memory chips are just the start of a revolution in bringing AI closer to the user

Giving the keynote address at CES 2025, Huang mentioned that Micron is providing memory for Nvidia’s new GeForce RTX 50 Blackwell gaming chips. Micron, though, is not a stranger to AI or to Nvidia.

It was the first chipmaker to ship advanced high-bandwidth memory known as HBM3E, which provides more than 20 times the memory bandwidth of standard DDR5-based DIMM server modules, and does so at 30% lower power consumption. Micron’s second-generation HBM3E chip was also among the first to be qualified for Nvidia’s H200 and GH200 accelerators.

As AI gets pushed further out to the edge, more RAM is going to be needed. You can run a 3B (3 billion parameter) LLM on a PC or smartphone with 8 GB or 16 GB of RAM, but it will be slow. Increasing memory enables faster processing. For example, HP’s (NYSE:HPQ) latest AI PC comes with up to 32 GB of RAM, an important step forward. But what if that were 128 GB? Huang revealed the Project DIGITS personal computer, which comes packaged with just that, plus 4 terabytes of SSD storage. It is approximately 1,000 times more powerful than the average laptop.

While it’s primarily meant for hardcore developers and researchers, not your general PC user, what if Apple (NASDAQ:AAPL) put more memory than just 8 GB in its iPhone? The speed at which it would be able to handle LLMs would be blazing fast.

As we advance from generative AI to agentic AI, which Huang points out will move into areas including coding assistants, customer service, and patient care, and then to physical AI for autos and robotics, the need for greater memory and storage will increase exponentially.

Micron Technology is perfectly positioned to capture a lot of that demand.

Key takeaways

MU stock is trading at a discount. It certainly got a boost from Huang’s address, with shares rising more than 10% the following day, but the stock still trades at a very attractive valuation. Shares go for just 8 times next year’s earnings estimates, with Wall Street forecasting a 15% compound annual growth rate over the next five years. That forecast could prove conservative.

The stock was hurt following last month’s fiscal 2025 first-quarter earnings report, when it dropped from around $110 a share to below $80. Much of that was due to Micron’s guidance for the coming quarter, which implied revenue would fall 9% sequentially. Although data center revenue surpassed 50% of total revenue for the first time ever, growing 400% year-over-year and 40% sequentially, the NAND business is still being hurt by lower PC and smartphone shipments.

It means the turnaround won’t happen overnight, but a dramatic recovery can be expected over the long haul. HBM chip growth over the next five years should be substantial, picking up the slack in NAND at the moment, and as AI and edge computing converge, Micron Technology could very well sit at the center of the future of AI technology.

 
