Alphabet (NASDAQ:GOOG) has emerged as a real force in AI chips over the past year. With the eighth generation of Google TPUs (tensor processing units) unveiled just in time for the age of agentic AI, perhaps it’s time to start viewing shares of Alphabet as worthy of a greater premium, especially as the TPU becomes ever more capable of competing with the GPU.
Of course, TPUs and GPUs are different pieces of hardware with different specialties. However, as AI workloads continue to gravitate towards inference (an inference inflection point, if you will), Google’s TPU certainly seems to stand out, especially as each generation leaps over the last. It’s not just inference where TPUs are starting to shine, though. The latest TPUs also excel at cost-effective training. Of course, time will tell how many firms opt for the all-new TPU 8t and 8i for training and inference, respectively.
At the end of the day, Google is still a big customer of Nvidia (NASDAQ:NVDA) GPUs. As its own silicon efforts rocket higher, though, the big question is whether TPUs have what it takes to meaningfully take share away from Nvidia.
Google’s latest TPUs are quite impressive
In any case, the latest and greatest TPUs aren’t just faster than the last generation; they’re meaningfully cheaper, too. The TPU 8i touts an 80% SRAM (static random access memory) improvement for agentic workloads over the previous generation (Ironwood). Those are serious gains, showing that it’s not just Nvidia that can make massive performance jumps. What’s most striking, at least in my humble opinion, is the watts-per-token performance.
In an era where investors are sick and tired of ballooning CapEx bills, much of which are being spent on AI hardware, the math matters, and it might make more of a difference over time. Perhaps there’s a reason why Anthropic, the AI innovator behind Claude, which is laser-focused on efficiency rather than “spending at all costs,” committed to up to 1 million v7 TPUs. Even OpenAI is warming up to TPUs, which may very well allow Google to attract a bit more of that Nvidia multiple.
Google is ready for the inference pivot
Either way, it’s becoming harder to believe that Nvidia’s impressive margins will stay elevated for the long haul, especially if the TPU cadence continues at such a pace. Bernstein analyst Mark Newman seems to think Google will win as AI pivots from training to inference.
Indeed, inference seems to be the big question mark that AI chip investors are pondering. And while there’s no reason why Nvidia and Google can’t both win as agents power AI demand that’s off the charts, one has to think that the puck is starting to (slowly) move in Google’s direction.
This alone may justify a price-to-earnings (P/E) multiple closer to the mid-30s. As TPUs continue to gain traction, one has to think that Google Cloud Platform (GCP) will also be due for some considerable sales and margin growth. Any way you look at it, more frontier AI innovators are starting to look towards TPUs. And in an era where AI compute is getting absurdly expensive, I’d argue that efficiency gains are where it’s at.
All considered, the 8t and 8i TPUs were impressive. And as more AI titans gravitate towards TPUs, it might be time to think about all the ways Google could begin to nibble at Nvidia’s big slice of the pie.
It’s not curtains for Nvidia yet
Now, Nvidia can play defense better than anyone, and it has made more than its fair share of inference pivots as well. Either way, the AI chip scene is getting more interesting, and the GPU titan will need to stay on its toes as competition heats up. With Vera Rubin’s 10x performance-per-watt gain over Blackwell, Nvidia is no slouch. And with Groq aboard, the race with TPUs and other AI chips is about to get very interesting.