As pointed out in the article, Nvidia has several advantages including:
- Better Linux drivers than AMD
- CUDA
- PyTorch is optimized for Nvidia
- High-speed interconnect
Each of the advantages is under attack:
- George Hotz is making better drivers for AMD
- MLX, Triton, JAX: higher-level abstractions that compile down to CUDA (see the JAX sketch below)
- Cerebras and Groq solve the interconnect problem
The article concludes that NVIDIA faces an unprecedented convergence of competitive threats. The flaw in the analysis is that these threats are not unified. Any serious competitor must address ALL of Nvidia's advantages. Instead, Nvidia is being attacked by multiple disconnected competitors, and each of those competitors is only attacking one Nvidia advantage at a time. Even if each of those attacks is individually successful, Nvidia will remain the only company that has ALL of the advantages.
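To make the abstraction point above concrete, here is a minimal JAX sketch (the function and inputs are made up for illustration): the model code is written once against the high-level API, and XLA lowers it to whatever backend is present, CUDA, ROCm, TPU, or plain CPU, without the author writing any CUDA.

import jax
import jax.numpy as jnp

# Toy computation written once against the high-level API. XLA decides how to
# lower it for whichever backend is installed (CPU here; CUDA, ROCm, or TPU if
# the matching jaxlib is present). No CUDA is written by hand.
@jax.jit
def scaled_softmax(x):
    z = x - jnp.max(x, axis=-1, keepdims=True)   # subtract max for numerical stability
    e = jnp.exp(z)
    return e / jnp.sum(e, axis=-1, keepdims=True)

x = jnp.ones((4, 8))
print(scaled_softmax(x).shape)   # (4, 8)
print(jax.devices())             # shows which backend actually ran it

Triton and MLX sit at different levels of the stack, but the lock-in argument is the same: once model code stops being written directly against CUDA, the kernel layer underneath becomes swappable.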
* Groq can't seem to produce hardware at scale beyond their "demo". They don't appear to have grown capacity in the years since their announcement, and they've switched to a pure SaaS model and don't even sell hardware anymore.
The same Hotz who lasted like 4 weeks at Twitter after announcing that he'd fix everything? It doesn't really inspire a ton of confidence that he can single-handedly take down Nvidia...
They have their own NN libraries etc., so adapting should be fairly focused, and AMD drivers historically have a hilariously bad reputation among people who program GPUs (I've been bitten a couple of times myself by weirdness).
I think you should look at it this way: they're trying to avoid Nvidia and make sure their code isn't tied to Nvidia-isms, and if AMD is troublesome enough for the basics, the step up to customized solutions is small enough to be worthwhile for something even cheaper than AMD.
> Any serious competitor must address ALL of Nvidia's advantages.
Not really. The article focuses on Nvidia being valued so highly by the stock market; he's not saying Nvidia is destined to lose its advantage in the space in the short term.
In any case, I also think the likes of MSFT/AMZN/etc. will eventually be able to reduce their capex by working on a well-integrated stack of their own.
They have an enormous amount of catching up to do, however; Nvidia have created an entire AI ecosystem that touches almost every aspect of what AI can do. Whatever it is, they have a model for it, and a framework and toolkit for working with or extending that model - and the ability to design software and hardware in lockstep. Microsoft and Amazon have a very diffuse surface area when it comes to hardware, and being a decent generalist doesn’t make you a good specialist.
Nvidia are doing phenomenal things with robotics, and that is likely to be the next shoe to drop; they are positioned for another catalytic moment similar to the one we have seen with LLMs.
I do think we will see some drawback or at least deceleration this year while the current situation settles in, but within the next three years I think we will see humanoid robots popping up all over the place, particularly as labour shortages arise due to political trends - and somebody is going to have to provide the compute, both local and cloud, and the vision, movement, and other models. People will turn to the sensible and known choice.
So yeah, what you say is true, but I don’t think it is going to have an impact on the trajectory of Nvidia.
>So how is this possible? Well, the main reasons have to do with software— better drivers that "just work" on Linux and which are highly battle-tested and reliable (unlike AMD, which is notorious for the low quality and instability of their Linux drivers)
This does not match my experience from the past ~6 years of using AMD graphics on Linux. Maybe things are different with AI/compute, I've never messed with that, but in terms of normal consumer stuff the experience of using AMD is vastly superior to dealing with Nvidia's out-of-tree drivers.
He's setting up a case for shorting the stock, i.e. the short pays off if growth or margins drop even a little from any of these (often well-funded) threats. The accuracy of the article is a function of the current valuation.
Exactly. You just need to see a slight deceleration in projected revenue growth (which has been running 120%+ YoY recently) and some downward pressure on gross margins, and maybe even just some market share loss, and the stock could easily fall 25% from that.
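Just to show how quickly that compounds (the percentages below are made-up illustrations, not forecasts):

# Hypothetical sensitivity check: an earnings-estimate cut plus P/E multiple
# compression compound multiplicatively into the share price.
# Both input percentages are illustrative assumptions, not forecasts.
earnings_cut = 0.10          # forward earnings estimate revised down 10%
multiple_compression = 0.17  # P/E multiple compresses by 17%

price_move = (1 - earnings_cut) * (1 - multiple_compression) - 1
print(f"implied price move: {price_move:.0%}")   # roughly -25%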
That is extraordinarily simplistic. If NVDA is slowing and AMD has gains to realize relative to NVDA, then the ~10x difference in market cap would imply that AMD is the better buy, which is why I am long AMD. You can't just look at the current P/E delta; you have to look at the expectations priced into one vs. the other. AMD gaining 2x relative to NVDA would mean the two are approximately equivalently valued. If there are unrealized AI-related gains, all bets are off. AMD closing 50% of the market-cap gap with NVDA would mean AMD is ~2.5x undervalued today.
Disclaimer: long AMD, and not precise on percentages. Just illustrating a point.
The point is, it should not be taken for granted that NVDA is overvalued. Their P/E is low enough that if you’re going to state that they are overvalued, you have to make the case. The article, while well written, fails to make the case because it has a flaw: it assumes that addressing just one of Nvidia’s advantages is enough to make it crash, and that’s just not true.
My point is that you have to make the case for anything being over/undervalued. The null hypothesis is that the market has correctly valued it, after all.
If you believe the space will eventually get commoditized over the medium to long term, the bear case is obvious. And based on history, there's a pretty high likelihood of that happening.
You have to look at non-GAAP numbers, and therefore looking at forward P/E ratios is necessary. On that basis, AMD is cheaper than NVDA. The reason AMD's P/E ratio looks high is the Xilinx acquisition: the way the deal is accounted for saves on taxes but depresses reported earnings, which makes the trailing ratio look really high.
NVDA is valued at $3.5 trillion, which means investors think it will grow to around $1 trillion in yearly revenue. Current revenue is around $35 billion per quarter, so call it $140 billion yearly. Investors are betting on a 7x increase in revenue. Not impossible, sounds plausible but you need to assume AMD, INTC, GOOG, AMZN, and all the others who make GPUs/TPUs either won't take market share or the market will be worth multiple trillions per year.
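Spelling out that arithmetic (the 3.5x price-to-sales multiple at maturity is the implicit assumption behind the $1 trillion figure):

# Back-of-the-envelope on the figures above; the 3.5x mature price/sales
# multiple is the assumption that turns a $3.5T valuation into ~$1T revenue.
market_cap = 3.5e12                    # NVDA valuation, USD
annual_revenue = 4 * 35e9              # ~$140B/yr at ~$35B per quarter

mature_ps_multiple = 3.5               # assumed price/sales once growth slows
implied_revenue = market_cap / mature_ps_multiple      # ~$1T/yr
print(f"implied annual revenue: ${implied_revenue / 1e12:.1f}T")
print(f"revenue growth needed: {implied_revenue / annual_revenue:.1f}x")   # ~7x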
Tech companies are valued higher because lots of people think there's still room for the big tech companies to consolidate market share and for the market itself to grow, especially as they all race towards AI. Low interest rates, tech and AI hype add to it.
Funny timing though, today NVDA lost $589 billion in market cap as the market got spooked.
No, that's not true. Hedge funds get paid so well because getting a small percentage of a big bag of money is still a big bag of money. This statement is more true the closer the big bag of money gets to infinity.
Unless something radically changed in the last couple years, I am not sure where you got this from? (I am specifically talking about GPUs for computer usage rather than training/inference)
> Unless something radically changed in the last couple years, I am not sure where you got this from?
This was the first thing that stuck out to me when I skimmed the article, and the reason I decided to invest the time reading it all. I can tell the author knows his shit and isn't just parroting everyone's praise for AMD Linux drivers.
> (I am specifically talking about GPUs for computer usage rather than training/inference)
Same here. I suffered through the Vega 64 after everyone said how great it is. So many AMD-specific driver bugs, AMD driver devs not wanting to fix them for non-technical reasons, so many hard-locks when using less popular software.
The only complaints about Nvidia drivers I found were "it's proprietary" and "you have to rebuild the modules when you update the kernel" or "doesn't work with wayland".
I'd hesitate to ever touch an AMD GPU again after my experience with it; I haven't had a single hiccup in years since switching to Nvidia.
Another ding against Nvidia for Linux desktop use is that only some distributions make it easy to install and keep the proprietary drivers updated (e.g. Ubuntu) or ship variants with the proprietary drivers preinstalled (Mint, Pop!_OS, etc.).
This isn’t a barrier for Linux veterans, but it adds significant friction for part-time users, even technically inclined ones, compared to the “it just works” experience one gets with an Intel/AMD GPU under just about every Linux distro.
They are, unless you get distracted by things like licensing, out-of-tree drivers, and binary blobs. If you'd rather pontificate about open-source philosophy and rights than get stuff done, go right ahead.
Check out Anthonix on Twitter. He's already done what George Hotz is trying to do, and he did it months ago. He's moved on from the RX 7900 XTX to the MI300X and is setting some records. He had to write the majority of the code himself but kept the parts of ROCm he deemed fit. He's always stirring George up when he has his AMD tantrums. Seriously though, how bad are AMD's engineers if one person in his free time can make a custom stack that outperforms ROCm?
The thing that unifies these threats is the scarcity of H100s.
He says this and talks about it in The Fallout section: even at BigCos with megabucks, the teams are starved for time on Nvidia chips, and if these innovations work, other teams will use them, and then, boom, Nvidia's moat is truncated somehow, which doesn't look good at such lofty multiples.
George Hotz is a hot Internet celebrity that has basically accomplished nothing of value but has a large cult following. You can safely ignore.
(Famous for hacking the PS3–except he just took credit for a separate group’s work. And for making a self-driving car in his garage—except oh wait that didn’t happen either.)
He took an “internship” at Twitter/X with the stated goal of removing the login wall, apparently failing to realize that the wall was a deliberate product decision, not a technical challenge. Now the X login wall is more intrusive than ever.
Yes, but it's worth mentioning that the break consisted of opening up the phone and soldering on a bypass for the carrier card locking logic. That certainly required some skills to do, but is not an attack Apple was defending against. This unlocking break didn't really lead to anything, and was unlike the later software unlocking methods that could be widely deployed.
You’re not wrong, but after all these years it’s fair to give benefit of the doubt - geohot may have grown as a person. The PS3 affair was incredibly disappointing.
Given the number of times he has been on the news for bombastic claims he doesn’t follow through on, I don’t think we need to guess. He hasn’t changed.
What specifically is in comma.ai that makes it less technically impressive? Comma.ai looks like epic engineering to me. I haven't made any self driving cars.
Why do you think otherwise? Can you share specific details?
In which way? As a user who switched from an AMD GPU to an Nvidia GPU, I can only report continued problems with NVIDIA's proprietary driver, and none with AMD. Is this maybe about the open-source drivers, or about usage for AI?
A new entrant with an order-of-magnitude advantage in e.g. cost, availability, or exportability can succeed even with poor drivers and no CUDA. It's only when you cost nearly as much as Nvidia that the tooling costs become relevant.
Those are definitely not the limiting factors here.
Not nearly all data centers are water cooled, and there is this amazing technology that can convert sunlight into electricity in a relatively straightforward way.
AI workloads (at least training) are about as geographically distributable as it gets, since they are not very latency-sensitive, and even if you can't obtain sufficient grid interconnection or buffer storage, you can always leave them idle at night.
Solar microgrids are cheaper and faster than nuclear. New nuclear isn't happening on the timescales that matter, even assuming significant deregulation.
Well, prediction is very difficult, especially with respect to the future. But the fundamentals look good.
Current world marketed energy consumption is about 18 terawatts. Current mainstream solar panels are 21% efficient. At this efficiency, the terrestrial solar resource is about 37000 terawatts, 2000 times larger than the entire human economy:
~ $ units
Currency exchange rates from exchangerate-api.com (USD base) on 2024-11-25
Consumer price index data from US BLS, 2024-11-24
7290 units, 125 prefixes, 169 nonlinear units
You have: 21% solarirradiance circlearea(earthradius)
You want: TW
* 36531.475
/ 2.7373655e-05
So, once datacenters are using seven hundred thousand times more power than currently, we might need to seek power sources for them other than terrestrial solar panels running microgrids. Solar panels in space, for example.
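For anyone without units installed, the same arithmetic in Python; the ~50 GW figure for current global datacenter load is my assumption (roughly in line with common estimates), everything else comes from the numbers above:

import math

# Same estimate as the units session above, plus the datacenter ratio.
solar_constant = 1361        # W/m^2, solar irradiance at Earth's distance from the Sun
earth_radius = 6.371e6       # m
panel_efficiency = 0.21      # from the figures above

# Sunlight intercepted by Earth's cross-section, converted at 21% efficiency.
terrestrial_solar_tw = panel_efficiency * solar_constant * math.pi * earth_radius**2 / 1e12
print(f"terrestrial solar resource: ~{terrestrial_solar_tw:,.0f} TW")   # ~36,500 TW

world_consumption_tw = 18    # current marketed energy consumption, from above
print(f"vs. the whole economy: {terrestrial_solar_tw / world_consumption_tw:,.0f}x")   # ~2,000x

datacenter_draw_tw = 0.05    # ASSUMPTION: ~50 GW of global datacenter load
print(f"vs. datacenters today: {terrestrial_solar_tw / datacenter_draw_tw:,.0f}x")     # ~700,000x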
You could be forgiven for wondering why this enormous resource has taken so long to tap into and why the power grid is still largely fossil-fuel-powered. The answer is that building fossil fuel plants only costs on the order of US$1–4 per watt (either nameplate or average), and until the last few years, solar panels cost so much more than that that even free "fuel" wasn't enough to make them economically competitive. See https://www.eia.gov/analysis/studies/powerplants/capitalcost... for example.
Today, however, solar panels cost US$0.10 per peak watt, which works out to about US$0.35 to US$1 per average watt, depending largely on latitude. This is 25% lower than the price of even a year ago and a third of the price of two years ago. https://www.solarserver.de/photovoltaik-preis-pv-modul-preis...
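The peak-to-average conversion is just division by the capacity factor; a quick sketch (the two capacity factors are illustrative bounds for a sunny low-latitude site vs. a cloudy high-latitude one):

# Convert module price per peak watt into price per average watt delivered.
# Capacity factor = average output / nameplate output; both values below are
# illustrative assumptions, not measured figures.
price_per_peak_watt = 0.10   # USD per peak watt, from above

for label, capacity_factor in [("sunny site", 0.28), ("high-latitude site", 0.10)]:
    price_per_avg_watt = price_per_peak_watt / capacity_factor
    print(f"{label}: ~${price_per_avg_watt:.2f} per average watt")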