
Red-Hot AI IPO Cerebras Claims Its Chips Are 20X Faster Than NVIDIA (NVDA)


Right now, NVIDIA (Nasdaq: NVDA) has a dominant market share in AI chips. Companies like Broadcom (Nasdaq: AVGO) and Marvell (Nasdaq: MRVL) challenge it by making custom chips, while Advanced Micro Devices (Nasdaq: AMD) provides competition in GPUs.

Yet NVIDIA’s market share is widely quoted at more than 70% of all AI chips. One competitor of interest is Cerebras. The company makes “Wafer Scale Engines,” gigantic chips with more than 4 trillion transistors. That compares to a little more than 200 billion in NVIDIA’s most advanced chips.

Cerebras recently filed paperwork to IPO and is expected to begin trading in 2025. In the article and video below, 7Investing Founder Simon Erickson and 24/7 Wall St. Analyst Eric Bleeker look at whether Cerebras is an IPO investors should scoop up next year.

Key Points in this Article

  • Cerebras is expected to IPO in 2025.
  • The company has a radically different approach to building AI chips that it claims is up to 20 times faster than NVIDIA’s H100 “Hopper” chips.
  • If you’re looking for top ideas in the AI space – grab a complimentary copy of our “The Next NVIDIA” report that details three companies poised to dominate the next wave of AI growth.

Is Cerebras a Top IPO in 2025?

Below are some of the key points from this conversation:

  • Cerebras filed its prospectus for an IPO on September 30, 2024.
  • The company is running at a significant loss – losing $66.6 million in the first half of 2024 on sales of $136.4 million.
  • In addition, Cerebras is currently highly dependent on a single customer.
  • Yet, while the company’s financials aren’t very compelling, its approach to AI chips is unique.
  • Cerebras builds massive chips named Wafer Scale Engines. These Wafer Scale Engines have more than 4 trillion transistors, far more than fit on NVIDIA’s much smaller chip dies.
  • The company claims its chips are 10 to 20 times faster at inference than NVIDIA’s H100 chips. Inference is rapidly becoming more important as emerging methods like test-time compute make inference performance a key differentiator in optimizing models.
  • The downside is that Cerebras’ chips are expensive and difficult to make. However, with AI in an arms race, signing just a few customers could lead to strong performance after the company IPOs.
  • Eric notes that there are some details in Cerebras’ IPO, such as a generous release clause for insiders, that give him pause, but we’ll continue to cover the company on 24/7 Wall St. as it approaches an IPO date.

Transcript

The following is a transcript of Eric and Simon’s conversation that’s been lightly edited.

[00:00:00] Eric Bleeker: We’re offering up three ideas today. Another one that you want to talk about is an upcoming IPO, a potential alternative to NVIDIA, which, you know, should get everyone’s hearts beating a little bit faster. So let’s talk about what you’re looking at in the potential pre-IPO space in, you know, AI right now.

[00:00:17] Simon Erickson: Well, everybody loves NVIDIA. Every time you say Nvidia, somebody smiles thinking about how much money they made off the stock over the last decade, right? But Eric, you’re smiling right now. I know you know how this feels. I mean, it’s one of the darlings of the stock market, and why? Because it’s creating a lot of value, it’s just being used, and it’s almost a must-have. And there are a lot of people who say, oh, our chips are better than Nvidia’s.

[00:00:38] Simon Erickson: Because if you say that, you can add, you know, three or four more zeros to the valuation you ask a venture capitalist for. But one that really might be doing it is called Cerebras Systems. This is a company that is expected to IPO here in 2025, early 2025; it’s not publicly traded right now. We’re just speculating that this is going to be publicly traded here very soon.

[00:01:00] Simon Erickson: They filed all the paperwork and are probably going to do it here within the next couple of months. But this is a company that says, you know, AI is transformative, right? And it wants to create the chips that are as efficient as possible for machine learning inference. You’ve got the training side, you know, you have to have a car recognize what a stop sign is, you have to teach the car what that is. But then you actually have to have the car respond to that on the inference side, the understanding piece of the computing that’s related to AI. And Cerebras has built these massive “wafer scale engines,” as they call them.

[00:01:32] Simon Erickson: I mean, the chips are this big. Yep, they’re massive. You know, if you see a picture of Cerebras’s chips, they’re just massive, and they’re going to be competing against Nvidia’s GPUs with 50 times more computing cores, 880 times more on-chip memory, and 7,000 times more memory bandwidth. They’re, you know, 57 times larger than the commercially available GPUs today.

[00:01:54] Simon Erickson: Why is that important? Well, because if you are building a neural network that is going to be using machine learning inference for really, really hard things, you want to have your computing be as efficient as possible. The trillions of operations that you can do per second, the teraflops, compared to the power that you are putting into the server to do these things: you have to have that be efficient, because these are just so power hungry out there right now.

[00:02:20] Simon Erickson: And I think Cerebras, if it can actually go out there and show that, you know, it’s claiming that it’s an order of magnitude more efficient than GPUs, 10 times more efficient, then I think you’re going to start seeing some of the big data centers that are doing AI computing, you know, the infrastructure-as-a-service providers, the Microsofts and the Googles of the world, but then also those that are building their own data centers, like the Meta Platforms of the world and what have you. They’re going to start winning some really, really big contracts.

[00:02:46] Simon Erickson: And so I just think Cerebras, you know, we don’t know as much about them as we know about Nvidia. They’re new. There are a lot of questions about this company’s stock, but my goodness, what a great place at a great time to do an IPO. They’re right in the middle of the hottest market trend in the entire world, right?

[00:03:02] Eric Bleeker: Yeah, I think there’s no question demand’s going to be very amplified for this. It’s funny enough, I’ve gotten the question from people before, you know, why cut up these wafers into these tiny chips and try to cram transistors in? I’m like, well, there’s a company trying to do the alternative.

[00:03:18] Eric Bleeker: And it’s what you’re describing. It’s a radical departure from how the entire chip industry has been built. One question, though, Simon: where do you see the opportunity here? In the sense that we’ve got Nvidia on one side, but it feels like a lot of these hyperscalers that control so much of the demand have really placed their bets

[00:03:38] Eric Bleeker: on their own custom chips as an alternative. We’ve got Trainium from Amazon, we’ve got TPUs from Google. So where is the market? Because when I looked at Cerebras’s S-1 filing, they currently have basically one customer, which is a quasi-sovereign company from the Middle East. So where do you see maybe demand coming from in the next year or two?

[00:04:02] Eric Bleeker: Who would buy these chips?

[00:04:04] Simon Erickson: Yeah, it’s super, super expensive to even do this. Even the R&D of developing your own chip, just the design, is $30 million per chip, right? So a lot of companies don’t even have the staff or the scientists or the R&D budget to even spend on things like that. And then you’ve got Microsoft, in addition to all the great names you just mentioned. Microsoft is the next one that’s going to go out there; it has already been working on its own custom AI chip and is going to spend $10 billion on manufacturing and design.

[00:04:31] Simon Erickson: And then there are all of the, you know, operating expenses to do something like that. I mean, if you don’t have $10 billion sitting around to commit to something of this magnitude, like the hyperscalers do, you’re going to look for the next best thing that’s out there that’s available. And we’re talking about orders of magnitude between the $10 billion that Microsoft alone is spending and what Cerebras did last year, in fiscal 2023: $78 million in revenue, total revenue.

[00:04:57] Simon Erickson: So there is certainly a delta between the two of those. But could you triple that revenue again, just like they tripled the revenue in 2023? Could they triple again in 2024 and beyond, along the lines of, you know, $200 million, maybe a billion dollars a year, maybe even up to three or four billion? Yes, I think there’s certainly, you know, 10X potential for a company like this, given that the mass market doesn’t have the resources of the hyperscalers like you just mentioned.

[00:05:20] Eric Bleeker: Yeah, and I guess, you know, that’s a good point in the sense that one kind of medium-sized customer could result in a tripling, and all of a sudden you’re way ahead of consensus and there’s rabid demand for sure. So if you were an investor out there, we haven’t talked about this particular company on the podcast; we’ll probably do a little bit

[00:05:39] Eric Bleeker: more as it comes closer to the market. But if you were sizing this, how would you size it in a portfolio? Is this kind of a 1 percent hedge, making some room against a larger company like Nvidia, or would you actually be a little bit more aggressive? How would you fit this into kind of a portfolio strategy of adding to AI stocks?

[00:06:00] Simon Erickson: Currently, I have Cerebras on our watch list of new ideas for 7Investing. That’s kind of obvious because it’s not publicly tradable anyway, but you know, these are the kinds of companies that I think are interesting, though I’m still taking a wait-and-see approach. Generally, you don’t want to buy in at an IPO.

[00:06:17] Simon Erickson: The reason companies IPO when they do is because typically the market is hot. There’s a lot of demand that they can showcase to, you know, the people that want to invest, and out of the gate they’re able to raise as much money at as high of a valuation as possible. And typically that kind of comes back down to earth over the following six months.

[00:06:34] Simon Erickson: There are also lockup periods and things associated with that. But generally I would not recommend buying right at any IPO and having to get into the hottest thing at the time. You know, there are expectations built into stock prices all the time. I’m still saying wait and see, but it’s certainly an interesting company to keep an eye on.

[00:06:51] Eric Bleeker: Yeah. And I did notice in their S-1 there’s a release clause: insiders can get out within five days if the stock is up 25%. So again, I think we will talk about this more on this podcast after it’s come to the market. Maybe there’s going to be a lot of demand for it, but the devil’s in the details a lot of times with these companies coming to market.

 
