I'm half joking, but if this AI boom continues we're going to see Nvidia exit the consumer GPU business. But Jensen Huang will never do that to us... (I hope)
on the contrary. this is the place they can try out new tech, new cores, new drivers, new everything with very little risk. driver crash? the gamer will just restart their game. an AI workload would stall and cost a lot of money.
basically, the gaming segment is the beta-test ground for the datacenter segment. and you have beta testers eager to pay high prices!
we see the same in CPUs by the way, where the datacenter lineup of both intel and amd lags behind the consumer lineup. gives time to iron out bios, microcode and optimizations.
Why would anyone sell a handful of GPUs to nobodies like us when they could sell a million GPUs for thousands apiece to a handful of big companies? We're speedrunning the absolute worst corpo cyberpunk timeline.
Might almost be a good thing, if it means abandoning overhyped/underperforming high-end game rendering tech, and taking things in a different direction.
The push for 4K with raytracing hasn't been a good thing: it has pushed hardware costs way up and led to attempts to fake it with AI upscaling and 'fake frames'. And even before that, the growing reliance on temporal antialiasing was becoming problematic.
The last decade or so of hardware/tech advances haven't really improved the games.
DLSS Transformer models are pretty good. Framegen can be useful, but its applications are niche due to the latency increase and artifacts. Global illumination can be amazing but is also pretty niche, as it's very expensive and comes with artifacts.
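On the latency point: interpolation-style frame generation can't show a generated frame until the next real frame has been rendered, so the extra delay scales with the base (pre-generation) frame time, which is why it feels fine at a high base frame rate and bad at a low one. A rough sketch of that scale (my own illustrative numbers, not vendor latency figures):

    # Why frame generation hurts more at low base frame rates: the
    # interpolated frame needs the *next* real frame to exist before it can
    # be generated, so the added delay is a sizeable fraction of one base
    # frame time, plus the cost of generating the frame itself.
    for base_fps in (30, 60, 120):
        base_frame_ms = 1000.0 / base_fps
        print(f"base {base_fps} fps: {base_frame_ms:.1f} ms between real frames "
              f"sets the scale of the added latency")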
Biggest flop is UE5 and its Lumen/Nanite. Really, everything would be fine if not for that crap.
And yeah, our hardware is not capable of proper raytracing at the moment.
The push for ray tracing comes from the fact that they've reached the practical limits of scaling more conventional rendering. RT performance is where we are seeing the most gen-on-gen performance improvement, across GPU vendors.
Poor RT performance is more a developer skill issue than a problem with the tech. We've had games like Doom: The Dark Ages that flat out require RT, yet the RT lighting pass only accounts for ~13% of frame time while delivering much better results than any raster GI solution could with the same budget.
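Rough arithmetic on what that ~13% means in milliseconds, using standard frame budgets (a back-of-envelope sketch, not profiler data):

    # Back-of-envelope: cost of an RT lighting pass that takes ~13% of the
    # frame, using the share quoted above and frame budgets of 1000 ms / fps.
    RT_SHARE = 0.13
    for fps in (60, 120):
        frame_budget_ms = 1000.0 / fps
        rt_pass_ms = RT_SHARE * frame_budget_ms
        print(f"{fps} fps: {frame_budget_ms:.1f} ms frame budget, "
              f"~{rt_pass_ms:.1f} ms on the RT lighting pass")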
The literal multi-million dollar question that executives have never bothered asking: When is it enough?
Do I, as a player, appreciate the extra visual detail in new games? Sure, most of the time.
But if you asked me what I enjoy playing more, 80% of the time I'd pull out a list of 10+ year old titles that I keep coming back to, plus more that I would rather play than anything on the market today, if only they still had an active playerbase (for multiplayer titles).
Honestly, I know I'm not alone in saying this: I'd rather we had more games focused on good mechanics and story, instead of visually impressive works that pile on MTX to recoup insane production costs. Maybe this is just the catalyst we need to get studios to redirect budgets toward making games fun instead of sinking so much into visual quality.
Well, in the case of Doom: The Dark Ages, it's not just about fidelity but about scale and production. Making TDA's levels with the baked GI used in the previous game would have taken their artists considerably more time and resulted in a 2-3x growth in install size, all while providing lighting that is less dynamic. The only benefit would have been the ability to support a handful of GPUs slightly older than the listed minimum spec.
Ray tracing has real implications not just for the production pipeline, but the kind of environments designers can make for their games. You really only notice the benefits in games that are built from the ground up for it though. So far, most games with ray tracing have just tacked it on top of a game built for raster lighting, which means they are still built around those limitations.
I'm not even talking about RT, specifically, but overall production quality. Increased texture detail, higher-poly models, more shader effects, general environmental detail, the list goes on.
These massive production budgets for huge, visually detailed games are causing publishers to take fewer creative risks, and when products inevitably fail in the market the studios get shuttered. I'd much rather go back to smaller teams and the more reasonable production values of 10+ years ago than keep getting the drivel we have now, and that's without even factoring in how expensive current hardware is.
I can definitely agree with that. AAA game production has become bloated with out of control budgets and protracted development cycles, a lot of that due to needing to fill massive overbuilt game worlds with an endless supply of unique high quality assets.
Ray tracing is a hardware feature that can help cut down on a chunk of that bloat, but only when developers can rely on it as a baseline.
They are already making moves that hint at that future. They are going to stop packaging VRAM with the GPUs they ship to third-party graphics card makers, who will have to source their own, probably at higher cost.
They will constrain supply before exiting. Exiting outright just isn't smart; they can stop developing and let the segment dwindle to a trickle, which also doubles as insurance in case AI flops.
Honestly, I'd prefer it. It might get AMD and Intel off their asses on GPU development. I stopped buying Nvidia GPUs ages ago, back before they saw value in the Linux/Unix market, and I'm tired of them sucking up all the air in the room.
They did get burned when crypto switched to dedicated hardware and Nvidia was left with (for them) huge surpluses of 10xx series hardware. But what they're selling to AI companies now is far more distinct from their consumer gear than those cards were.