
One of the benefits of GeForce Now is I'm pretty sure they are subsidizing it. I get unlimited gaming on a 4080 for $20 USD.

I have a 4080 at home and I can't get the same latency/frame rate with Parsec even though I have a gigabit connection.

Cool project; unsure about the actual experience, though.



> One of the benefits of GeForce Now is I'm pretty sure they are subsidizing it.

NVidia is getting the cost advantages of being the manufacturer.

Renting out rack mounted gamer PCs costs more than users are willing to pay for it. Users pay by the month, but server costs are per hour of use. As a business, this only works if users don't use it much, like a gym. Or users face severe limits on how much time they can use. Most of the early "cloud gaming" companies are gone, because the business model didn't work for them.

For a sense of what it really costs, see Shadow PC. They charge $30 - $40 per month, after the first month intro price.


If you sell globally, then you can get max utilization from different sides of the globe as people sleep, so you can over-subscribe each GPU. I would expect that people don't play 24/7, but if you charge like they do, then you get to make a profit.


That doesn't work due to the latency. If you want 30ms roundtrip, you need to be within ~3000km/1800 miles of the subscriber.
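
As a quick sanity check (a rough sketch, assuming signals travel through fiber at roughly 2/3 the speed of light, and ignoring encode/decode, routing and queuing overhead):

    # Max one-way distance for a given round-trip latency budget.
    C_FIBER_KM_PER_MS = 200  # ~200,000 km/s in fiber = 200 km per millisecond

    def max_one_way_distance_km(rtt_ms: float) -> float:
        one_way_ms = rtt_ms / 2
        return one_way_ms * C_FIBER_KM_PER_MS

    print(max_one_way_distance_km(30))  # ~3000 km, the figure above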


What if the gaming PCs were in orbit? You could have the orbit of the satellites track the sleeping times of populations, so that you don't have PCs above areas where nobody would use them.

Starlink promises latency around 20-40ms, so I do wonder if this is possible. You would have far more distance between the client/server, but I would expect that routing would be greatly simplified.


Starlink gets that latency by orbiting close to the Earth. At 525km, a Starlink bird completes one orbit in 95 minutes. For a twelve hour demand response orbit they'd have to be around 20000km high, more than one Earth diameter away, giving a far worse speed of light delay than just talking to a server one continent away.
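
Rough numbers via Kepler's third law (a sketch using Earth's standard gravitational parameter and mean radius, which are textbook constants rather than figures from the thread):

    import math

    MU_EARTH = 398_600  # km^3/s^2, Earth's gravitational parameter
    R_EARTH = 6_371     # km, mean Earth radius

    def orbital_period_min(altitude_km: float) -> float:
        # Circular-orbit period from Kepler's third law.
        a = R_EARTH + altitude_km  # semi-major axis
        return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

    def altitude_for_period_km(period_hours: float) -> float:
        # Altitude giving a desired circular-orbit period.
        t = period_hours * 3600
        a = (MU_EARTH * (t / (2 * math.pi)) ** 2) ** (1 / 3)
        return a - R_EARTH

    print(orbital_period_min(525))      # ~95 minutes, Starlink's shell
    print(altitude_for_period_km(12))   # ~20,200 km for a 12-hour orbit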


Powering and cooling tens, hundreds of gaming PCs in orbit is beyond infeasible right now. RTGs can't produce that much power even if we had the plutonium. Solar panels are too big and heavy and don't work on the dark side of the planet. Batteries to complement the solar would also be prohibitively heavy.


Oh, I'm sure there are plenty of hurdles; I was just wondering if the latency problem could be solved by having the computers in orbit.


I guess all of these hurdles are solvable, but not the economic one.

These GPUs are not valuable enough to timeshare via orbiting satellites!


You're right, but it would be really cool!


They could rent them out for GPU compute jobs, no need to even worry about the license restrictions in NV's case.


Also, one does have to consider energy! In Germany, energy prices are currently at 0.328€/kWh from the big providers, and considering an RTX 4080 has 320W alone and say 230W extra for the rest of the system plus PSU inefficiencies (my M1 MacBook draws way under 30W), the difference between these two is about 520W. That means that on energy consumption alone you need to game about 129 hours/month, or 4.3 hours per day, to break even... _but_ this is before having to buy a 1300€ GPU that one would have to replace every year to have the same as GFN.
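
Spelling that out (the ~22€/month subscription price is my assumption; it's roughly what the 129 h figure implies, and the other numbers are from above):

    price_eur_per_kwh = 0.328
    extra_draw_kw = (320 + 230 - 30) / 1000        # desktop minus MacBook, ~0.52 kW
    gfn_eur_per_month = 22.0                       # assumed GFN subscription price

    cost_per_hour = extra_draw_kw * price_eur_per_kwh      # ~0.17 EUR per gaming hour
    break_even_hours = gfn_eur_per_month / cost_per_hour   # ~129 h/month
    print(break_even_hours, break_even_hours / 30)         # ~129 h, ~4.3 h/day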

But IMHO the biggest appeal of GFN in Germany is temperature. In Germany there are no air conditioners in normal households; even apartments with more than 2k€ a month rent don't have them. So it is quite luxurious to game without 0.5kW blasting into your room.


Interesting that electricity prices are so different than in southern Sweden even though the distance to Germany is less than 100 km and our grids are somewhat connected. Here in southern Sweden the cost has been negative during daytime for the past week, a trend that will probably continue over summer thanks to solar energy. The net cost for buying electricity is still a bit over 0 though, because of taxes and transfer fees.


There's only 600MW of direct grid connection between Sweden and Germany[1]. Southern Sweden uses 4GW right now, Germany 47GW.

[1]: https://en.wikipedia.org/wiki/Baltic_Cable


SOUTHERN Sweden? We have the worst electricity prices within Sweden (in Malmö at least). The connections to the north, where there is a lot of power generation, are really weak, and they shut down nearly all the power plants in the region, so we are stuck importing expensive energy.

My building has solar on the roof and a collective agreement with e.on precisely to try to control the insane costs.


Prices were high two years ago, but that is no longer the case. A lot of solar power came online last year. Currently the price is negative:

https://www.eon.se/el/elpriser/aktuella


And then locals don't want wind power.


In Finland locals started wanting wind power when it became clear a wind power park can bring considerable tax income for a small municipality.


> considering an RTX 4080 has 320W alone

> having to buy a 1300€ GPU, that one would have to replace every year

Feels like gamers are quite the demanding audience. They could instead be getting something like an Intel Arc A380 (or something that supports AV1), upscaling from 720p/1080p using framegen and running games on medium settings. That might make the equation work out better, though perhaps not if the industry pushes for more and more complex graphics all the darn time, even while the engines themselves are capable of scaling back all the way to mobile devices.

I probably have lower standards in that regard; I wouldn't expect to max out graphics when trying to play games on a MacBook or a netbook or something, due to the smaller screen anyways. Besides, it feels like modern game graphics are so noisy you can hardly make things out well. Nowadays I mostly play indie games that aren't super graphically complex, yet are still lovely experiences.

That said, my current GPU is actually an Arc A580 (replaced my old RX 580) and it's been pretty good since I got ReBAR working, and 1080p at 60 fps is the sweet spot for me (except slightly higher framerates like 72 fps feel better in VR).


> upscaling from 720p/1080p using framegen

This isn't something you can just do. I mean, it almost is (look at apps like Lossless Scaling or Magpie), but it's always best implemented by the game itself.

Just because the buzzword exists doesn't mean the problem is solved.


From everything I've heard and seen, it seems like XeSS/FSR/DLSS are all making steady progress and the internal rendering resolutions being lower than what's on your screen is no longer such an issue, with more and more improvements being made.

There's no reason why you couldn't have the game be upscaled to whatever the stream resolution is from something a bit more performance friendly, with the quality still being good enough (given that some video artifacts will be there anyways) and things like text/UI being legible.

In regards to frame generation, it seems that eventually whatever Nvidia is doing with DLSS 3 will find its way into other vendors' products in a comparable alternative form. Once game engines like Unity/Unreal/Godot get support for the upscalers out of the box, I think the technology will be even more commonplace.


> From everything I've heard and seen, it seems like XeSS/FSR/DLSS are all making steady progress and the internal rendering resolutions being lower than what's on your screen is no longer such an issue, with more and more improvements being made.

That's nice, but it doesn't really fix the issue of them not being available in every game, or the fact that it varies which one is even available.

> There's no reason why you couldn't have the game be upscaled to whatever the stream resolution is from something a bit more performance friendly, with the quality still being good enough (given that some video artifacts will be there anyways) and things like text/UI being legible.

If you're just going to upscale the final image without any internal cooperation from the game, there is no reason to upscale a 720p game and stream 1080p when you could simply stream 720p instead and have the client do the upscaling, unless of course you are using an upscaler that chokes on video artifacts (in which case you probably shouldn't even have video artifacts, it's 2024 and H265/AV1 exist).

> Once game engines like Unity/Unreal/Godot get support for the upscalers out of the box, I think the technology will be even more commonplace.

Don't they all already support it out of the box? It just needs cooperation from the game dev, like I said.

For example, Pacific Drive uses UE4, not even UE5, and still supports DLSS. It almost works fine, except displays (such as the ones in the car and garage) flicker and shimmer under DLSS because the devs didn't implement them in a way that works with the upscaler.

This can't just be solved by the engine, since DLSS integrates multiple different views of the game (depth buffer, motion buffer, etc) and the dev needs to make sure all of these views are faithful to the effects they're actually showing on-screen, else bugs like that will just happen.

That specific problem could be solved by running the game at native-720 and then using a naïve upscaler such as FSR on the output framebuffer, but that's going to be a significant visual tradeoff compared to if DLSS were implemented correctly. Upscaling isn't magic, but FSR (upscaling the final render output) is even less magic than DLSS (upscaling multiple intermediate artifacts).

And games often omit settings from their menus, even those that work perfectly well in the engine itself. Having upscaling implemented by the engine doesn't necessarily mean the dev cares enough to implement the setting for it and test it.


> having to buy a 1300€ GPU, that one would have to replace every year

The flagships usually get refreshed every 2+ years, not yearly. If you're switching yearly, then every second change is gonna be a performance downgrade.

Furthermore, the flagship of the last generation is usually at least on par with the card below the new flagship, so you'd have top-of-the-line performance for at least 2 generations, which is 3-5 years. The 4080 specifically has worse performance than a 3090 unless you're using DLSS/upsampling, which most people with high-budget systems don't want as it increases latency.

Your napkin math wrt the energy doesn't really make sense either; your card only draws that much power if it's actually on full load. My 4090 is usually mostly idle.

My MacBook Pro actually draws more than my desktop PC with a 4090 if both are idle (that's because the MacBook has an integrated display, which offsets the higher power draw of the desktop components), at least according to the measurements of my smart plugs, which measure energy draw at the plug/wall.

Finally, there is no way in hell a working adult will find time to play 4+ hours a day, every day.

Overall I'd say you've no idea what you're talking about and are just rationalizing and in denial. The performance you get from cloud gaming providers is terrible, worse than a budget PC which costs $1k in its entirety. And you'll still have it at the end, while the cloud gaming subscriber will have nothing after the sub ends.


GeForce Now works great. I subscribed for a while to play Cyberpunk 2077 on a remote 4080 and it felt like playing on a local computer. But with much higher settings and framerate than I get on my local GeForce 1080.

> Finally, there is no way in hell a working adult will find time to play 4+ hours a day, every day.

Don't make the assumption that everyone lives their life exactly like you do.


A 1080 is 8 years old and wasn't a flagship; an equivalent card costs ~$100 currently... Also, that's 4 generations, not 2.

It'd be a complete disaster if gfn couldn't beat that performance.

I've attempted to use GFN several times too and can only conclude that you've never played at a decent setup if that felt like a local machine.

The input latency is so high that anything with pvp is entirely unplayable (it adds at least 40-80ms).

Maybe the only thing you're playing is cinematic single-player? In that case, sure, it works well enough. Not worth the subscription, as you'd get better performance and experience from a Steam Deck, but to each their own.


I have and love a Steam Deck, and I used to have GeForce Now for about 18 months before I got a new 3080 rig. We might be playing incredibly different games in very different ways, and that's ok - but GeForce Now being beaten by a Steam Deck is, again, taking a reasonable position way too far.

GeForce Now lets me play CP2077 on ultra with all the ray tracing on a high-resolution 32" monitor. The Steam Deck can "play" CP2077 at 800p if you don't mind a melting jet engine in your hands. They're both great for what they do and are, and they can make a powerful combo, but in terms of performance I don't see how they can possibly be compared :-/

Edit: did you by any chance only try the free / low tier version of GeForce Now?


No, Nvidia always gives out a 3 month free trial with every purchase. I tried it out when I bought a 3080 when it was released and a 4090 last year.

> cp2077

That's indeed a game that works well with GFN, and Witcher 3 and similar games too. Pretty sure you're still playing at 720p on GFN though; they're just using DLSS for upsampling. You should be able to see the upsampled resolution in the settings of Cyberpunk if I remember correctly. My sub ran out, however, so no checking anymore.


Thanks for the reply!

Fwiw - CP2077 ran at full res internally. I've played it both with and without DLSS on GFN. I believe when I started it was a 2080 for the first paid tier and a 3080 for the top paid tier. Now the top paid tier is a 4080.


It's great for turn-based games like BG3 or Cities: Skylines. My biggest issue with it is how long it takes to start up a session.


> Your napkin math wrt the energy doesn't really make sense either; your card only draws that much power if it's actually on full load.

Not only that, but there's evidence that more powerful cards can actually perform the same workloads as less powerful cards more efficiently. Of course, it's not going to be more efficient if you let it run at full tilt, but if, for example, you lower the power limit of a 3090 to 200W, it's still going to blow a 3060 Ti (also 200W) out of the water in terms of performance. Higher performance for the same power means you can also get the same performance at a lower power.

> Finally, there is no way in hell a working adult will find time to play 4+ hours a day, every day.

I work full-time and still get the hours of 5 PM to 12 AM to myself, which is 7 hours, and then I can still get 8 hours of sleep


I agreed with everything up to the "performance of cloud gaming is terrible" part - I'm an extremely happy GeForce Now user. I have gigabit internet and use it with a 27" monitor, historically for performance reasons and, now that I've got a 3080, for occasional convenience reasons (I have half a dozen Intel NUCs around the house; it's usually simpler and more reliable to stream a game from GeForce Now servers a thousand km away than via Steam streaming from my upstairs PC - go figure).

Now! Neither one of us has spoken our assumptions out loud, so let's do that - I play mostly single-player games. There are people who mostly play PvP and cannot live without a 250Hz screen and a million-DPI mouse, and those users likely won't enjoy cloud gaming. That does not mean "cloud gaming is terrible" overall, but it may well be that "cloud gaming is terrible for highly demanding competitive PvP usage" :).


.328 and they are complaining! Bah!


May I ask what benefit you get from GeForce Now when you already have a 4080 at home?


Noise/heat/electricity cost/not infesting my PC with kernel-level anticheat.


What assumptions make it less than profitable at $20? If the hw costs are $1000 and you need one hw setup per 5 subscribers, you'd get $2400 in 2 years for $1000 in hw investment. Power, bandwidth and other costs are something, but it doesn't sound obviously unprofitable to me.
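
Spelled out (using only the assumptions above of 5 subscribers per rig and a 2-year hardware life; other costs deliberately left out):

    subscribers_per_rig = 5
    price_per_month = 20      # USD
    months = 24               # 2-year hardware lifetime
    hw_cost = 1000            # USD per rig

    revenue = subscribers_per_rig * price_per_month * months   # $2400
    print(revenue - hw_cost)  # $1400 margin before power, bandwidth, support, tax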


The hardware could be used for other things; I think one 4080 card can also serve the equivalent of two 1080-tier cards. I would bet they rent out capacity for other stuff during the day too.

I would also be surprised if they were only 5-to-1 on users. Still, it's probably not their most profitable venture.


First, you take USD 4-5 off that in tax. Then more for support, finance and marketing. Then electricity, bandwidth and insurance. Then game licences. Then hw...


Corporate tax is a flat 21%, and that's on profit, not revenue. I don't believe Nvidia pays anything for "licenses" either. There's no way that overhead should make them unprofitable even at retail prices.


I was thinking of sales tax / VAT.


I get better latency and frame rate than GeForce Now on a 2660 using Moonlight. Using h.265 or AV1 is necessary. I'm not familiar with Parsec, but I'll bet it's defaulting to h.264.


The default for Parsec is h.265; it's only on older machines that can't hardware encode/decode that it falls back to h.264.

The reason they're likely seeing latency issues is that Parsec doesn't have the localized servers that GeForce Now has. Servers closer to your location will always have lower latency.

This is simply a case of Nvidia having more money, more servers, and better infrastructure than Parsec, which isn't surprising given the scale and cashflow difference between the two companies.


Parsec doesn’t use their servers though. It’s peer to peer once discovery is done. So the “server” is likely elsewhere in your house. That’s basically as close as you can get.


Oh, wild, I didn't know that. I definitely see some high latency on Parsec within my home.

It might be bouncing around through the net somehow instead of transferring directly within my local network? I wonder if something about local peer-to-peer discovery isn't working properly?

Maybe that's also happening to the other person comparing GeForce Now to Parsec?


I've been using Parsec on my local network for several months now and I see absolutely no latency whatsoever. GFN and any streaming provider give me a very small but noticeable lag in shooters, but Parsec on my home network is genuinely perfect. I game from my MacBook and I play whatever I want from my home server (3090 and 4060) in another room. Both my MacBook and computer are wired, of course. Important to note is that I needed a dummy HDMI plug on my computer for a 'true' screen; any kind of virtual screen caused it to lag hard, probably because the game or stream ran without any kind of hardware acceleration.

I have to say, I love the way Parsec works and allows you to use your whole library (since it doesn't stream the game, but it streams your whole desktop). But if there is ever a service that allows you to get the same thing including remote servers (since I travel), I would probably switch.


Appreciate the details you’ve provided!

I’m going to do some more Parsec experiments this weekend thanks to your info.

Something is clearly wrong with my setup, I’m going to figure it out.


I'm not a professional myself either, but the things I would check are:

- Make sure you run hardware acceleration, you can see if this is applied after opening a connection and checking the details. Without acceleration it will be terrible. This can either be as simple as enabling it or possibly you need the monitor below.

- Make sure a monitor is plugged in (there might be a way around this, but I just bought a $4 HDMI dummy instead of trying to figure out how to do this the software way)

Edit: formatting


In my experience if both sides are wireless the connection quality is terrible, no matter how good the signal is.


Yeah, everything here is 2.5G ethernet end-to-end. I don't think that's the issue. :)


I also enjoy Parsec at home (and for non gaming remote desktop). For some reason outside of home (I'm travelling to a city 10ms away) it's not as good


Peer-to-peer across networks is impossible unless you can punch a hole or open a port, e.g. via UPnP. It's possible this isn't enabled, and that could cause Parsec to fall back to a central server.


Why don't you use ZeroTier / Tailscale?


Because those aren't peer-to-peer in reality. They're peer-to-peer in terms of a virtual network that's often (not necessarily, but certainly in this case) implemented using a central server anyway.


Those are peer-to-peer. The coordination server manages the public keys and exchanges the IP addresses initially. After that, peers connect directly.

Relays are involved only briefly initially.


How is Tailscale not P2P? Just because there's a central server for discovery and key exchange does not somehow make it not peer-to-peer.

If the actual communications between two clients is P2P then, by definition, the VPN is P2P. Because that's the actual part people are talking about when they talk about VPNs.


I mean, sure, but that doesn't mean that Parsec uses central servers by default, which was the entire point of the GP comment; that their servers are worse than Nvidia's.


Parsec uses whatever server you tell it to.

Parsec will discover servers on LAN and/or with a port open and display them as distinct entries in the connection list, usually positioned in front of the central server options.

So there's no default as you're always the one choosing.


What? No. That's not how Parsec works at all. Are you talking about the same app?

Parsec displays computers to connect to. If a hole can be punched, it will use a P2P connection. If not, it automatically falls back to their STUN servers or the Parsec Relay server if you're on the Teams plan. There is no "server" to choose from. There aren't multiple entries for a single computer with different connection options. I don't know where you hallucinated all of these things but that's just not how Parsec works.

This article makes it quite clear that P2P is the default with a fallback to STUN: https://support.parsec.app/hc/en-us/articles/4440911373325-P...


> I don't know where you hallucinated all of these things but that's just not how Parsec works.

I opened the app and looked at it. I distinctly remember there being multiple entries if a computer is on the local network. I don't have another computer on the network right now so I can't check.


I'm not sure you would use h265 for "better latency", as its encoding overhead is greater than h264's. Sure, h265 requires less bandwidth, but it is also a slower encoder. AV1 is supposedly a fast encoder, but hardware to encode it is limited to the most recent generation.

The capture method can have a big effect on latency, and technologies like the Desktop Duplication API or NVFBC are also necessary for achieving lower latencies by minimising data transfer between the CPU and the GPU when capturing and encoding.


I'm getting a stable 5ms latency over ethernet with Sunshine + Moonlight (forced HW encoding & decoding). Wi-Fi adds a few ms, and you need a good router and a mostly idle access point for perfectly smooth 4K. For FPS games like L4D2 I do get better results (aim) running locally, but I ascribe that to the mouse acceleration weirdness which I never managed to configure _just right_.


This is my experience, 4-6ms. Even over the internet on a different continent I can game from my home PC with negligible latency with Moonlight.


Does the higher price tier work better than the low tier? I usually get weird stutter and resolution drops/packet loss during peak times or network congestion on the Nvidia end. It's typically related to new game or game season releases, so I think it's on Nvidia's end.




