Dr. Michael Burry, the man made famous by The Big Short, has had a lot to say in recent months. And while his bearish bets have made some AI- and Mag Seven-heavy investors second-guess their high-tech positions, I do think that many may be at risk of discounting some of the man’s previous words of warning. Even with his Substack and the numerous headlines going in depth on the moves he’s made and his thoughts on various themes, specifically regarding the AI bubble, it still feels like some of his comments are being taken with a grain of salt.
Of course, Dr. Burry is a genius who made one of the greatest trades of all time in the face of the Great Financial Crisis. And while there’s uncertainty as to whether he’s early to the AI bubble (if there’s one at all), I do think it’s wise to take in what the man has to say, especially given his reputation, track record, and ability to take deep dives into the data that supports his theses. Though there have been a lot of Michael Burry headlines of late, I do think there’s one piece of commentary that investors might be at risk of forgetting about: the chip depreciation schedules of the big hyperscalers.
How fast should hyperscalers be depreciating their AI chips?
Sure, it’s been some months since Burry first remarked on the matter. But it’s still a concerning point that I believe mega-cap tech should take more time to clear the air on. Of course, if it does turn out that the big hyperscalers are overestimating the “useful life” of their hardware (think those pricey Nvidia (NASDAQ:NVDA) GPUs), there could be a considerable financial hit to take on the chin later on. So, the big question is whether GPUs live two to three years or whether four to six years is more appropriate, given that hyperscalers might actually be able to stretch the life of such hardware.
Of course, older GPUs can be repurposed, and there are ways to put less strain on chips. Could it be that GPUs are more like cars that could last longer if the proper care and maintenance are put into them over time, and they’re not pushed to their limit (think a car that’s being used for drag races versus regular daily commutes)? It’s hard to tell.
Personally, I have no idea whether it’s a stretch to call what’s going on with depreciation “one of the more common frauds of the modern era.” Either way, it’s my humble opinion that the accounting skews toward the more aggressive side. But, at the end of the day, some discretion needs to be exercised, and if firms can add a bit of innovation on top to extend the useful life of chips, perhaps there are savings to be had.
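To see why the useful-life assumption matters so much, here’s a toy sketch of straight-line depreciation. All figures are invented for illustration only and are not any hyperscaler’s actual numbers; the point is simply that stretching the assumed life of the same GPU fleet halves the annual charge and flatters reported operating income.

```python
def annual_depreciation(cost: float, useful_life_years: int, salvage: float = 0.0) -> float:
    """Straight-line depreciation: spread cost (minus salvage value) evenly over the assumed life."""
    return (cost - salvage) / useful_life_years

# Hypothetical figures, purely for illustration
GPU_FLEET_COST = 40e9               # $40B of GPUs on the books
INCOME_BEFORE_DEPRECIATION = 30e9   # operating income before the depreciation charge

for life in (3, 6):
    dep = annual_depreciation(GPU_FLEET_COST, life)
    income = INCOME_BEFORE_DEPRECIATION - dep
    print(f"{life}-year life: depreciation ${dep / 1e9:.1f}B/yr -> operating income ${income / 1e9:.1f}B")
```

In this toy case, moving from a three-year to a six-year schedule cuts the annual charge from about $13.3B to about $6.7B, lifting reported operating income by roughly 40% without a single extra dollar of revenue, which is exactly why the choice of schedule is worth scrutinizing.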
Vera Rubin’s big efficiency gains might be more of a concern
What’s more concerning, in my opinion, is that Nvidia’s coming Vera Rubin chips could make its older chips feel less than capable.
Could it be that they’re so efficient that the last generation of chips becomes technologically obsolete sooner rather than later?
That’s the big risk that I think could justify shortening the depreciation schedule on GPUs. Indeed, the risk for the hyperscalers is that Nvidia’s latest and greatest efficiency and performance gains are too good. Perhaps Blackwell stands to be a victim of Rubin’s success. Also, with so many hyperscalers building their own incredibly efficient chips, perhaps there are bigger efficiency gains to be had over the next two to three years.
On the flip side, though, if Nvidia can’t keep the pace of efficiency gains going strong every generation, there is a good chance that older chips do suffice, in which case technological obsolescence becomes less of a concern. Either way, there’s risk here, and I think investors would be wise to take it into consideration.
Depreciation is a big deal, but the bigger question is whether the AI revolution is real
Despite the giant question marks surrounding AI chip depreciation, I believe the gains to be had from the AI revolution could make a future write-down of hardware less of a needle-mover.
Of course, running the risk of overstating profits by double-digit percentage points is no small matter. But if AI monetization kicks into high gear (the software slump seems to suggest AI tools are the real deal), perhaps there are some positive offsets to think about as well. Any way you look at it, I think the efficiency-driven AI plays, like Apple (NASDAQ:AAPL), are better poised to gain from here.