
I'm half joking but if this AI boom continues we're going to see Nvidia exit from consumer GPU business. But Jensen Huang will never do that to us... (I hope)




There are a couple of reasons why Jensen won't take off the gaming leather jacket just yet:

1. Gaming cards are their R&D pipeline for data center cards. Lots of innovation came from gaming cards.

2. It's a market defense to keep other players down and keep them from growing their way into data centers.

3. It's profitable (probably the main reason, but boring)

4. Hedge against data center volatility (10 key customers vs millions)

5. Antitrust defense (which they used when they tried to buy ARM)


6. Techies who use NVidia GPUs in their PCs are more likely to play with AI and ultimately contribute to the space as either a developer or a user

7. Maybe just don’t put all your eggs in one basket, especially when that basket is an industry that has yet to materialize its promise.

They'll access GPUs through their company VPN

If they're unemployed, they'll just rent from the cloud

How many of you still manage your own home server?


> 1. Gaming cards are their R&D pipeline for data center cards. Lots of innovation came from gaming cards.

No way that is true any more. Five years ago, maybe.

https://www.reddit.com/r/pcmasterrace/comments/1izlt9w/nvidi...


on the contrary. this is the place they can try out new tech, new cores, new drivers, new everything with very little risk. driver crash? the gamer will just restart their game. the AI workload will stall and cost a lot of money.

basically, the gaming segment is the beta-test ground for the datacenter segment. and you have beta testers eager to pay high prices!

we see the same in CPUs by the way, where the datacenter lineup of both intel and amd lags behind the consumer lineup. gives time to iron out bios, microcode and optimizations.


Why would anyone sell a handful of GPUs to nobodies like us when they could sell a million GPUs for thousands apiece to a handful of big companies? We're speedrunning the absolute worst corpo cyberpunk timeline.

Because when you lose even one of those big companies in your handful, it tanks your business. Customer diversity is a good thing.

And they're not selling a handful of GPUs to nobodies like us; they're selling millions of GPUs to millions of nobodies.


Gaming is now less than 10% of nvidia's revenue. We're really not adding any meaningful diversity to their bottom line anymore.

> Customer diversity is a good thing.

Tell that to Micron.


The way things are going no one will be able to afford a PC.

Instead we will be streaming games from our locked down tablets and paying a monthly subscription for the pleasure.


You will own nothing and be happy.

Might almost be a good thing, if it means abandoning overhyped/underperforming high-end game rendering tech, and taking things in a different direction.

The push for 4K with raytracing hasn't been a good thing, as it's pushed hardware costs way up and led to the attempts to fake it with AI upscaling and 'fake frames'. And even before that, the increased reliance on temporal antialiasing was becoming problematic.

The last decade or so of hardware/tech advances haven't really improved the games.


DLSS Transformer models are pretty good. Framegen can be useful but has niche applications due to latency increase and artifacts. Global illumination can be amazing but also pretty niche as it's very expensive and comes with artifacts.

Biggest flop is UE5 and its Lumen/Nanite. Really, everything would be fine if not for that crap.

And yeah, our hardware is not capable of proper raytracing at the moment.


> Framegen can be useful but has niche applications

Somebody should tell that to the AAA game developers that think hitting 60fps with framegen should be their main framerate target.


The latest DLSS and FSR are good actually. Maybe XeSS too.

The push for ray tracing comes from the fact that they've reached the practical limits of scaling more conventional rendering. RT performance is where we are seeing the most gen-on-gen performance improvement, across GPU vendors.

Poor RT performance is more a developer skill issue than a problem with the tech. We've had games like Doom The Dark Ages that flat out require RT, but the RT lighting pass only accounts for ~13% of frame times while pushing much better results than any raster GI solution would do with the same budget.
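For scale, that share can be translated into milliseconds with a quick back-of-envelope calculation (a rough sketch: the 60 fps target is my assumption, and the ~13% figure is just the estimate from the comment above):

```python
# Back-of-envelope: what ~13% of a frame budget means in milliseconds.
# Assumes a 60 fps target; the 13% figure is the commenter's estimate
# for the RT lighting pass in Doom: The Dark Ages.
frame_budget_ms = 1000 / 60          # ~16.7 ms per frame at 60 fps
rt_share = 0.13                      # RT lighting pass share of frame time
rt_cost_ms = frame_budget_ms * rt_share
print(f"RT lighting pass: ~{rt_cost_ms:.1f} ms of a {frame_budget_ms:.1f} ms frame")
```

In other words, on that estimate the RT pass costs a couple of milliseconds per frame, which is why it can replace a raster GI pipeline without blowing the budget.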


The literal multi-million dollar question that executives have never bothered asking: When is it enough?

Do I, as a player, appreciate the extra visual detail in new games? Sure, most of the time.

But, if you asked me what I enjoy playing more 80% of the time? I'd pull out a list of 10+ year old titles that I keep coming back to, and more that I would rather play than what's on the market today if they only had an active playerbase (for multiplayer titles).

Honestly, I know I'm not alone in saying this: I'd rather we had more games focused on good mechanics and story, instead of visually impressive works that pile on MTX to recoup insane production costs. Maybe this is just the catalyst we need to get studios to redirect budgets to making games fun instead of spending a bunch of budget on visual quality.


Well in the case of Doom: The Dark Ages, it's not just about fidelity but about scale and production. To make TDA's levels with the baked GI used in the previous game would have taken their artists considerably more time and resulted in a 2-3x growth in install size, all while providing lighting that is less dynamic. The only benefit would have been the ability to support a handful of GPUs slightly older than the listed minimum spec.

Ray tracing has real implications not just for the production pipeline, but the kind of environments designers can make for their games. You really only notice the benefits in games that are built from the ground up for it though. So far, most games with ray tracing have just tacked it on top of a game built for raster lighting, which means they are still built around those limitations.


I'm not even talking about RT, specifically, but overall production quality. Increased texture detail, higher-poly models, more shader effects, general environmental detail, the list goes on.

These massive production budgets for huge, visually detailed games, are causing publishers to take fewer creative risks, and when products inevitably fail in the market the studios get shuttered. I'd much rather go back to smaller teams, and more reasonable production values from 10+ years ago than keep getting the drivel we have, and that's without even factoring in how expensive current hardware is.


I can definitely agree with that. AAA game production has become bloated with out of control budgets and protracted development cycles, a lot of that due to needing to fill massive overbuilt game worlds with an endless supply of unique high quality assets.

Ray tracing is a hardware feature that can help cut down on a chunk of that bloat, but only when developers can rely on it as a baseline.


I think Nvidia realises that selling GPUs to individuals is useful as it allows them to develop locally with CUDA.

This is a huge reason.

They are already making moves that might suggest that future. They are going to stop packaging VRAM with their GPUs shipped to third-party graphics card makers, who will have to source their own, probably at higher cost.

They will constrain supply before exiting. Exiting outright just wouldn't be smart: they can simply stop developing consumer parts and let the segment dwindle to a trickle, and keeping it around also works as insurance in case AI flops.

In the words of Douglas Adams, there are those who say that this has already happened.

Honestly, I'd prefer it. It might get AMD and Intel more off their ass for GPU development. I already stopped buying Nvidia GPUs ages ago, before they saw value in the Linux/Unix market, and I'm tired of them sucking up all the air in the room.

Intel GPUs are probably not going to last much longer, considering they did a deal with nvidia for integrated GPUs.

Jensen is too paranoid to do it. But whoever comes after him will do it ASAP.

They did get burned when crypto switched to dedicated hardware and Nvidia was left with huge surpluses of 10xx-series hardware. But what they're selling to AI companies now is far more distinct from their consumer gear.

Keep the retail investors happy so they keep pumping your stock.

Wonder if Google will ever start selling TPUs.



I was thinking large ones, to other AI companies.


