
Great article but it seems to have a fatal flaw.

As pointed out in the article, Nvidia has several advantages including:

   - Better Linux drivers than AMD
   - CUDA
   - PyTorch is optimized for Nvidia
   - High-speed interconnect
Each of the advantages is under attack:

   - George Hotz is making better drivers for AMD
   - MLX, Triton, JAX: higher-level abstractions that mean you aren't tied to CUDA
   - Cerebras and Groq solve the interconnect problem
The article concludes that NVIDIA faces an unprecedented convergence of competitive threats. The flaw in the analysis is that these threats are not unified. Any serious competitor must address ALL of Nvidia's advantages. Instead, Nvidia is being attacked by multiple disconnected competitors, each attacking only one Nvidia advantage at a time. Even if each of those attacks is individually successful, Nvidia will remain the only company that has ALL of the advantages.
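Since the thread keeps coming back to these abstraction layers, here's a minimal JAX sketch of the idea: the same high-level array code is traced once and compiled by XLA for whichever backend is present (CPU, GPU, or TPU), so nothing in it is CUDA-specific. The function and values are illustrative, not from the article.

```python
import jax
import jax.numpy as jnp

@jax.jit  # trace once; XLA compiles for whatever backend is available
def scaled_add(x, y, alpha):
    # Element-wise alpha * x + y; XLA can fuse this into a single kernel.
    return alpha * x + y

x = jnp.arange(4.0)           # [0., 1., 2., 3.]
y = jnp.ones(4)
print(scaled_add(x, y, 2.0))  # [1. 3. 5. 7.]
print(jax.devices())          # CPU, GPU, or TPU, depending on the machine
```

The same source runs unmodified on an Nvidia GPU, an AMD GPU (via ROCm builds of JAX), or a TPU, which is exactly the kind of portability that chips away at the CUDA lock-in advantage.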


I want the NVIDIA monopoly to end, but there is still no real competition.

* George Hotz has basically given up on AMD: https://x.com/__tinygrad__/status/1770151484363354195

* Groq can't produce more hardware past their "demo". It seems like they haven't grown capacity in the years since they announced, and they switched to a complete SaaS model and don't even sell hardware anymore.

* I don't know enough about MLX, Triton, and JAX.


I also noticed that Groq's Chief Architect now works for NVIDIA.

https://research.nvidia.com/person/dennis-abts


That George Hotz tweet is from March last year. He's gone back and forth on AMD a bunch more times since then.


The same Hotz who lasted like 4 weeks at Twitter after announcing that he'd fix everything? It doesn't really inspire a ton of confidence that he can single-handedly take down Nvidia...


is that good or bad?


I consider it a good sign that he hasn’t completely given up. But it sure all seems shaky.


Honestly I tried searching his recent tweets for AMD and there was way too much noise in there to figure out his current position!


" we are going to move it off AMD to our own or partner silicon. We have developed it to be very portable."

https://x.com/__tinygrad__/status/1879617702526087346


Honest question: that sounds more difficult than getting things to play with commodity hardware. Maybe I am oversimplifying it, though.


They have their own nn, etc. libraries, so adapting should be fairly focused, and AMD drivers historically have a hilariously bad reputation among people who program GPUs (I've been bitten a couple of times myself by weirdness).

I think you should consider it this way: if they're trying to avoid Nvidia and make sure their code isn't tied to Nvidia-isms, and AMD is troublesome enough even for the basics, then the step to a customized solution is small enough to be worthwhile for something even cheaper than AMD.


Thanks, I don't have any experience in this realm and this was helpful to digest the problem space.


It looks like he's close to having his own AMD stack; tweet linked in the article, Jan 15, 2025: https://x.com/__tinygrad__/status/1879615316378198516


We'll check in again with him in 3 months and he'll still be just 1 piece away.


$1000 bounty? That's like 2 hours of development time at market rate lol


> Any serious competitor must address ALL of Nvidia's advantages.

Not really. His article focuses on Nvidia being valued so highly by stock markets; he's not saying that Nvidia is destined to lose its advantage in the space in the short term.

In any case, I also think that the likes of MSFT/AMZN/etc. will eventually be able to reduce their capex spending by working on a well-integrated stack of their own.


They have an enormous amount of catching up to do, however; Nvidia have created an entire AI ecosystem that touches almost every aspect of what AI can do. Whatever it is, they have a model for it, and a framework and toolkit for working with or extending that model - and the ability to design software and hardware in lockstep. Microsoft and Amazon have a very diffuse surface area when it comes to hardware, and being a decent generalist doesn’t make you a good specialist.

Nvidia are doing phenomenal things with robotics, and that is likely to be the next shoe to drop; they are positioned for another catalytic moment similar to the one we have seen with LLMs.

I do think we will see some drawback or at least deceleration this year while the current situation settles in, but within the next three years I think we will see humanoid robots popping up all over the place, particularly as labour shortages arise due to political trends - and somebody is going to have to provide the compute, both local and cloud, and the vision, movement, and other models. People will turn to the sensible and known choice.

So yeah, what you say is true, but I don't think it is going to have an impact on the trajectory of Nvidia.


>So how is this possible? Well, the main reasons have to do with software— better drivers that "just work" on Linux and which are highly battle-tested and reliable (unlike AMD, which is notorious for the low quality and instability of their Linux drivers)

This does not match my experience from the past ~6 years of using AMD graphics on Linux. Maybe things are different with AI/compute, I've never messed with that, but in terms of normal consumer stuff the experience of using AMD is vastly superior to trying to deal with Nvidia's out-of-tree drivers.


They are.


He's setting up a case for shorting the stock, i.e. betting that growth or margins drop a little from any of these (often well-funded) threats. The accuracy of the article is a function of the current valuation.


Exactly. You just need to see a slight deceleration in projected revenue growth (which has been running 120%+ YoY recently) and some downward pressure on gross margins, and maybe even just some market share loss, and the stock could easily fall 25% from that.


AMD P/E ratio is 109, NVDA is 56. Which stock is overvalued?


That is extraordinarily simplistic. If NVDA is slowing and AMD has gains to realize compared to NVDA, then the 10x difference in market cap would imply that AMD is the better buy. Which is why I am long in AMD. You can't just look at the current P/E delta. You have to look at expectations of one vs the other. AMD gaining 2x over NVDA means they are approximately equivalently valued. If there are unrealized AI related gains all bets are off. AMD closing 50% of the gap in market cap value between NVDA and AMD means AMD is ~2.5x undervalued.

Disclaimer: long AMD, and not precise on percentages. Just illustrating a point.


The point is, it should not be taken for granted that NVDA is overvalued. Their P/E is low enough that if you're going to state that they are overvalued, you have to make the case. The article, while well written, fails to make the case because it has a flaw: it assumes that addressing just one of Nvidia's advantages is enough to make it crash, and that's just not true.


If investing were as simple as looking at the P/E, all P/Es would already be at 15-20, wouldn't they?


Not saying it is as simple as looking at P/E


My point is that you have to make the case for anything being over/undervalued. The null hypothesis is that the market has correctly valued it, after all.


In the long run, probably yes, but a particular stock is less likely to be accurately valued in the short run.


If, medium to long term, you believe the space will eventually get commoditized, the bear case is obvious. And based on history there's a pretty high likelihood of that happening.


glad you are not my financial adviser :)


You have to look at non-GAAP numbers, and therefore looking at forward P/E ratios is necessary. When you look at that, AMD is cheaper than NVDA. Moreover, the reason AMD's P/E ratio looks so high is the Xilinx acquisition: amortizing it (which also saves on taxes) depresses GAAP earnings and makes the trailing P/E look really high.


rofl Forward PE ....


On the other hand, getting a bigger slice of the existing cake as a smaller challenger can be easier than baking a bigger cake as the incumbent.


Intel had a great P/E a couple of years ago as well :)


Hey let’s buy intel


NVDA is valued at $3.5 trillion, which means investors think it will grow to around $1 trillion in yearly revenue. Current revenue is around $35 billion per quarter, so call it $140 billion yearly. Investors are betting on a 7x increase in revenue. Not impossible, and it sounds plausible, but you need to assume AMD, INTC, GOOG, AMZN, and all the others who make GPUs/TPUs either won't take market share or the market will be worth multiple trillions per year.
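The arithmetic above can be checked in a few lines. All inputs are the parent comment's figures (including the assumed $1 trillion of implied revenue), not official guidance:

```python
market_cap = 3.5e12        # NVDA valuation per the comment, USD
quarterly_revenue = 35e9   # current revenue, USD per quarter

annual_revenue = 4 * quarterly_revenue      # ~$140B/year
implied_annual_revenue = 1.0e12             # the comment's assumption
growth_multiple = implied_annual_revenue / annual_revenue

print(f"annual revenue: ${annual_revenue / 1e9:.0f}B")  # $140B
print(f"implied growth: {growth_multiple:.1f}x")        # 7.1x
```

So the "7x" in the comment is really ~7.1x; close enough for a back-of-the-envelope bet either way.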


I thought the ship of valuing public companies at 3x revenues or 5x earnings had long since sailed?


Tech companies are valued higher because lots of people think there's still room for the big tech companies to consolidate market share and for the market itself to grow, especially as they all race towards AI. Low interest rates, tech and AI hype add to it.

Funny timing though, today NVDA lost $589 billion in market cap as the market got spooked.


If it were all so simple, they wouldn’t pay hedge fund analysts so much money…


No, that's not true. Hedge funds get paid so well because getting a small percentage of a big bag of money is still a big bag of money. This statement becomes truer the closer the big bag of money gets to infinity.


> The accuracy of the article is a function of the current valuation.

ah ... no ... that's nonsense trying to hide behind stilted math lingo.


> - Better Linux drivers than AMD

Unless something radically changed in the last couple years, I am not sure where you got this from? (I am specifically talking about GPUs for computer usage rather than training/inference)


> Unless something radically changed in the last couple years, I am not sure where you got this from?

This was the first thing that stuck out to me when I skimmed the article, and the reason I decided to invest the time reading it all. I can tell the author knows his shit and isn't just parroting everyone's praise for AMD Linux drivers.

> (I am specifically talking about GPUs for computer usage rather than training/inference)

Same here. I suffered through the Vega 64 after everyone said how great it is. So many AMD-specific driver bugs, AMD driver devs not wanting to fix them for non-technical reasons, so many hard-locks when using less popular software.

The only complaints about Nvidia drivers I found were "it's proprietary" and "you have to rebuild the modules when you update the kernel" or "doesn't work with wayland".

I'd hesitate to ever touch an AMD GPU again after my experience with it; I haven't had a single hiccup in years since switching to Nvidia.


Another ding against Nvidia for Linux desktop use is that only some distributions make it easy to install and keep the proprietary drivers updated (e.g. Ubuntu) or ship variants with the proprietary drivers preinstalled (Mint, Pop!_OS, etc.).

This isn’t a barrier for Linux veterans but it adds significant resistance for part-time users, even those that are technically inclined, compared to the “it just works” experience one gets with an Intel/AMD GPU under just about every Linux distro.


Wayland was a requirement for me. I've used an AMD GPU for years. I had a bug exactly once with a Linux update, but it has been stable since.


Wayland doesn't matter in the server space though.


They are, unless you get distracted by things like licensing and out-of-tree drivers and binary blobs. If you'd rather pontificate about open-source philosophy and rights than get stuff done, go right ahead.


Check out Anthonix on Twitter. He's already done what George Hotz is trying to do, and he did it months ago. He's moved on from the RX 7900 XTX to the MI300X and is setting some records. He had to write the majority of the code himself but kept the parts of ROCm he deemed fit. He's always stirring George up when he has his AMD tantrums. Seriously though, how bad are AMD's engineers if one person in his free time can make a custom stack that outperforms ROCm?


The unification of the flaws is the scarcity of H100s

He says this and talks about it in The Fallout section: even at BigCos with megabucks, the teams are starved for time on the Nvidia chips, and if these innovations work, other teams will use them, and then, boom, Nvidia's moat is truncated somehow, which doesn't look good at such lofty multiples.


Sorry, I don’t know who George Hotz is, but why isn’t AMD making better drivers for AMD?


George Hotz is a hot Internet celebrity that has basically accomplished nothing of value but has a large cult following. You can safely ignore.

(Famous for hacking the PS3–except he just took credit for a separate group’s work. And for making a self-driving car in his garage—except oh wait that didn’t happen either.)


He took an “internship” at Twitter/X with the stated goal of removing the login wall, apparently failing to realize that the wall was a deliberate product decision, not a technical challenge. Now the X login wall is more intrusive than ever.


He was famous before the PS3 hack, he was the first person to unlock the original iPhone.


Yes, but it's worth mentioning that the break consisted of opening up the phone and soldering on a bypass for the carrier-card locking logic. That certainly required some skill, but it is not an attack Apple was defending against. This unlocking break didn't really lead to anything, unlike the later software unlocking methods that could be widely deployed.


Well he also found novel exploits in multiple later iPhone hardware/software models and implemented complete jailbreak applications.


You’re not wrong, but after all these years it’s fair to give benefit of the doubt - geohot may have grown as a person. The PS3 affair was incredibly disappointing.


Given the number of times he has been on the news for bombastic claims he doesn’t follow through on, I don’t think we need to guess. He hasn’t changed.


Comma.ai works really well. I use it every day in my car.


What about comma.ai?


He promised Waymo.


What specifically is in comma.ai that makes it less technically impressive? Comma.ai looks like epic engineering to me. I haven't made any self driving cars.

Why do you think otherwise? Can you share specific details?


> - Better Linux drivers than AMD

In which way? As a user who switched from an AMD GPU to an Nvidia GPU, I can only report a continued stream of problems with Nvidia's proprietary driver, and none with AMD. Is this maybe about the open-source drivers, or usage for AI?


George is writing software to directly talk to consumer AMD hardware, so that he can sell more Tinyboxes. He won't be doing that for enterprise.

Cerebras and Groq need to solve the memory problem. They can't scale without adding 10x the hardware.


> George Hotz is making better drivers for AMD

lol


*George Hotz is making posts online talking about how AMD isn’t helping him


George Hotz tried to extort AMD into giving him $500k in free hardware and $2m cash, and they politely declined.


It was arguably not that polite, and it caused them some bad PR, IMHO.


You have to know the history and a bit of inside rumors to understand what was really going on.

What came out of it (and the semianalysis article) was that Anush would step up to the plate and work on improving the software.

George making noise is just a momentary blip in time that will be forgotten a week later…


A new entrant with an order-of-magnitude advantage in e.g. cost or availability or exportability can succeed even with poor drivers and no CUDA, etc. It's only when you cost nearly as much as Nvidia that the tooling costs become relevant.


Don't forget they bought Mellanox and have their own HBA and switch business.


There is not enough water (to cool data centers) to justify NVDA's current valuation.

The same is true of electricity: neither nuclear power nor fusion will be online anytime soon.


Those are definitely not the limiting factors here.

Not nearly all data centers are water cooled, and there is this amazing technology that can convert sunlight into electricity in a relatively straightforward way.

AI workloads (at least training) are just about as geographically distributable as it gets, due to not being very latency-sensitive, and even if you can't obtain sufficient grid interconnection or buffer storage, you can always leave them idle at night.


Right - they are not limiting factors, they are reasons that NVDA is overvalued.

Stock price is based on future earnings.

The smart money knows this and is reacting this morning - thus the drop in NVDA.


Solar microgrids are cheaper and faster than nuclear. New nuclear isn't happening on the timescales that matter, even assuming significant deregulation.


Can you back up that solar microgrids will supply enough power to justify NVDA's current valuation?


Well, prediction is very difficult, especially with respect to the future. But the fundamentals look good.

Current world marketed energy consumption is about 18 terawatts. Current mainstream solar panels are 21% efficient. At this efficiency, the terrestrial solar resource is about 37000 terawatts, 2000 times larger than the entire human economy:

    ~ $ units
    Currency exchange rates from exchangerate-api.com (USD base) on 2024-11-25
    Consumer price index data from US BLS, 2024-11-24
    7290 units, 125 prefixes, 169 nonlinear units

    You have: 21% solarirradiance circlearea(earthradius)
    You want: TW
            * 36531.475
            / 2.7373655e-05
IEA reports that currently (three years ago) datacenters used 460 TWh/year. In SI units, that's 0.05 terawatts. https://iea.blob.core.windows.net/assets/6b2fd954-2017-408e-...

So, once datacenters are using seven hundred thousand times more power than currently, we might need to seek power sources for them other than terrestrial solar panels running microgrids. Solar panels in space, for example.
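The same back-of-the-envelope calculation can be reproduced in Python. The constants here are the standard solar constant and mean Earth radius (the units(1) session above used slightly different values, hence the small discrepancy in the last digits):

```python
import math

SOLAR_CONSTANT_W_M2 = 1361  # top-of-atmosphere irradiance, W/m^2
EARTH_RADIUS_M = 6.371e6    # mean Earth radius, m
PANEL_EFFICIENCY = 0.21     # current mainstream panel efficiency

# Sunlight intercepted by Earth's cross-section, converted at panel efficiency.
cross_section_m2 = math.pi * EARTH_RADIUS_M ** 2
resource_tw = PANEL_EFFICIENCY * SOLAR_CONSTANT_W_M2 * cross_section_m2 / 1e12

# IEA figure: 460 TWh/year of datacenter consumption, converted to average TW.
datacenter_tw = 460 / (365.25 * 24)

print(f"terrestrial solar resource: ~{resource_tw:,.0f} TW")  # ~36,000+ TW
print(f"datacenter draw: ~{datacenter_tw:.3f} TW")            # ~0.052 TW
print(f"headroom: ~{resource_tw / datacenter_tw:,.0f}x")      # roughly 700,000x
```

This confirms the "seven hundred thousand times" figure above; it ignores atmosphere, geometry, and land area, but the conclusion survives a few orders of magnitude of haircut.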

You could be forgiven for wondering why this enormous resource has taken so long to tap into and why the power grid is still largely fossil-fuel-powered. The answer is that building fossil fuel plants only costs on the order of US$1–4 per watt (either nameplate or average), and until the last few years, solar panels cost so much more than that that even free "fuel" wasn't enough to make them economically competitive. See https://www.eia.gov/analysis/studies/powerplants/capitalcost... for example.

Today, however, solar panels cost US$0.10 per peak watt, which works out to about US$0.35 to US$1 per average watt, depending largely on latitude. This is 25% lower than the price of even a year ago and a third of the price of two years ago. https://www.solarserver.de/photovoltaik-preis-pv-modul-preis...


Geohot still at it?

goat.



