Show HN: Open-source GeForce NOW alternative with Stadia's social features (github.com/netrisdotme)
176 points by WanjohiRyan on May 19, 2024 | 108 comments


I was trying to find their actual streaming implementation in their codebase before giving up and checking their Discord. Apparently parts of their game streaming stack are being developed in private repos, and their subscription service is the only working version of this project.

In its current state I wouldn’t call this project open-source.


You should check out this project (https://kyber.media/) by the man behind VLC and ex-CTO of a cloud streaming company (JB Kempf). It also uses QUIC for transport and VLC/ffmpeg for fast encoding. Unfortunately it's not launched yet AFAIK.

Brief slides in English: https://conf.kde.org/event/5/contributions/147/attachments/7...

More detailed overview (in French): https://www.youtube.com/watch?v=EVgvBulXcDA&t=3368s


If you’re interested in stuff like this, take a look at the looking-glass project.

It uses shared memory to display frames from a VM running under a Linux host.

I’m sure there’s a way to transmit those frames over a network if one was clever enough


> I’m sure there’s a way to transmit those frames over a network if one was clever enough

Something like Sunshine[1], coupled with Moonlight[2]?

[1] https://github.com/LizardByte/Sunshine

[2] https://github.com/moonlight-stream/moonlight-qt


That's a really cool idea.


Sigh. That was the main thing I was interested in.


Yeah this is misleading enough to earn a flag from me. Those are the only things that would be terribly interesting from an open source perspective.


I am a GFN subscriber (and Stadia prior to that), so I am quite interested in this. GFN is great (I no longer have a gaming laptop and just use game streaming, ~90% from GFN and the rest via Moonlight from a server in my loft/attic), but the major drawback is that you cannot access your entire game library. I presume it is a combination of a) compatibility, but probably more likely b) contracts/business/legal.

GFN does have an integration with Microsoft's game service too, which offers a rolling library of new games to try. If Netris does not support anything apart from Steam, then that will be a shame, as I won't be able to access the MS games.

Still, despite that, I'm interested to see how this one goes! I am in the sign-up queue and wait with interest.


It's legal. They used to cover pretty much all of it, then publishers came at them one after another wanting a "deal". The moment they gain any kind of popularity, they're going to hit the same legal wall.

As for using it for your own needs 1:1, I wish Steam would expand its Steam-to-Steam self-streaming to cover usage outside the local LAN.


> I wish Steam would expand its Steam-to-Steam self-streaming to cover usage outside the local LAN.

They've done that with their Steam Link apps in 2019: https://www.polygon.com/2019/3/14/18266000/steam-link-anywhe...

The initial pairing has to be done on LAN I believe, but it will work afterwards through Steam. I remember trying it with a Raspberry Pi even. Steam itself will also work as a client and not require the initial pairing setup.


You can use Steam to stream over the internet. You have to enter a pairing code once on the remote machine, but after that you can play anything, you can even add a "non-steam game" to stream any application you want.

I do this all the time with my powerful work computer at the office while I'm at home. It works surprisingly well, actually.

The only shitty part is that they removed the ability for you to download a new game on the remote machine from your local steam client. The only way to do it now is with the steam app on your phone.


Tell me about the server in your attic. I’m in Florida and so putting tech in my attic seems like a bad idea. Is your attic/loft air conditioned? Do you live in a cold climate?

My RTX 3090 is literally unusable upstairs. I have it in a closet downstairs and that closet gets quite toasty, but I would imagine the Florida attic will take it to junction temp at idle.

I’ve also thought about putting some kind of ventilation system in an upstairs closet that pushes warm air into the attic but that too seems like it could be a bad idea.


I live in London so heat is not really that much of a concern. It can get hot in the summer (30+°C is not uncommon) but so far it has not caused any problems. Then again, I only have a medium-range CPU and GPU (an i5 something and a 1060 or 1070 IIRC; I can't remember as I use it so infrequently). Most of my time is with an X1 Carbon laptop and streaming from GFN, so I barely use the desktop in the loft and it sits idle almost all the time.


Right on, thank you


I never thought I'd find myself subscribing to games streaming but geforce now really is amazing for someone who is trying to live and travel minimally but still wants to try the occasional big budget game.


One of the benefits of GeForce Now is I'm pretty sure they are subsidizing it. I get unlimited gaming on a 4080 for US$20.

I have a 4080 at home and I can't get the same latency/frame rate with Parsec even though I have a gigabit connection.

Cool project; not sure about the actual experience.


> One of the benefits of GeForce Now is I'm pretty sure they are subsidizing it.

NVidia is getting the cost advantages of being the manufacturer.

Renting out rack mounted gamer PCs costs more than users are willing to pay for it. Users pay by the month, but server costs are per hour of use. As a business, this only works if users don't use it much, like a gym. Or users face severe limits on how much time they can use. Most of the early "cloud gaming" companies are gone, because the business model didn't work for them.

For a sense of what it really costs, see Shadow PC. They charge $30 - $40 per month, after the first month intro price.


If you sell globally, then you can get max utilization from different sides of the globe as people sleep, so you can over-subscribe each GPU. I would expect that people don't play 24/7, but if you charge like they do, then you get to make a profit.


That doesn't work due to the latency. If you want 30ms roundtrip, you need to be within ~3000km/1800 miles of the subscriber.
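
As a rough sanity check on that figure (a sketch; it assumes a signal speed of ~200 km/ms in fiber, about two thirds of c, and ignores routing detours and encode/decode time):

    # Upper bound on server distance for a target round-trip time.
    SPEED_IN_FIBER_KM_PER_MS = 200  # roughly 2/3 of the speed of light

    def max_one_way_distance_km(rtt_ms: float) -> float:
        return (rtt_ms / 2) * SPEED_IN_FIBER_KM_PER_MS

    print(max_one_way_distance_km(30))  # 3000.0 km, matching the ~3000 km above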


What if the gaming PCs were in orbit? You could have the orbit of the satellites track the sleeping times of populations, so that you don't have PCs above areas where nobody would use them.

Starlink promises latency around 20-40ms, so I do wonder if this is possible. You would have far more distance between the client/server, but I would expect that routing would be greatly simplified.


Starlink gets that latency by orbiting close to the Earth. At 525km, a Starlink bird completes one orbit in 95 minutes. For a twelve hour demand response orbit they'd have to be around 20000km high, more than one Earth diameter away, giving a far worse speed of light delay than just talking to a server one continent away.
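
For reference, a quick check of those numbers with Kepler's third law (a sketch; standard constants, circular orbits assumed):

    import math

    MU_EARTH = 398_600.4418  # km^3/s^2, Earth's gravitational parameter
    EARTH_RADIUS_KM = 6_371.0

    def altitude_for_period_km(period_s: float) -> float:
        # T = 2*pi*sqrt(a^3/mu)  =>  a = (mu * (T / (2*pi))^2)^(1/3)
        a = (MU_EARTH * (period_s / (2 * math.pi)) ** 2) ** (1 / 3)
        return a - EARTH_RADIUS_KM

    print(altitude_for_period_km(95 * 60))    # ~530 km, close to Starlink's shell
    print(altitude_for_period_km(12 * 3600))  # ~20,200 km, a GPS-style semi-synchronous orbit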


Powering and cooling tens, hundreds of gaming PCs in orbit is beyond infeasible right now. RTGs can't produce that much power even if we had the plutonium. Solar panels are too big and heavy and don't work on the dark side of the planet. Batteries to complement the solar would also be prohibitively heavy.


Oh, I'm sure there are plenty of hurdles, I was just wondering if the latency problem could be solved by having the computers in orbit.


I guess all of these hurdles are solvable, but not the economic one.

These GPUs are not valuable enough to timeshare via orbiting satellites!


You're right, but it would be really cool!


They could rent them out for GPU compute jobs, no need to even worry about the license restrictions in NV's case.


Also, one does have to consider energy! In Germany, energy prices are currently at 0.328€/kWh from the big providers, and considering an RTX 4080 draws 320W alone and say 230W extra for the rest of the system + PSU inefficiencies (my M1 MacBook draws well under 30W), the difference between these two is 520W. That means that on energy consumption alone you need to game about 129 hours/month, or 4.3 hours per day, to break even... _but_ this is before having to buy a 1300€ GPU, which one would have to replace every year to have the same as GFN.
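
In code form, the break-even arithmetic above (a sketch; the subscription price is not stated in the comment, so ~22 EUR/month is assumed because that is what the 129-hour figure implies):

    # Break-even gaming hours vs. a cloud subscription, using the figures above.
    price_per_kwh_eur = 0.328
    extra_draw_kw = 0.520            # desktop minus the M1 MacBook, per the comment
    subscription_eur_per_month = 22  # assumption; not stated in the comment

    cost_per_hour = extra_draw_kw * price_per_kwh_eur
    breakeven_hours = subscription_eur_per_month / cost_per_hour
    print(round(cost_per_hour, 3), round(breakeven_hours))  # ~0.171 EUR/h, ~129 h/month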

But IMHO the biggest appeal of GFN in Germany is temperature. In Germany there are no air conditioners in normal households; even in apartments with more than 2k a month in rent there are none. So it is quite luxurious to game while not having 0.5kW blasting into your room.


Interesting that electricity prices are so different than in southern Sweden even though the distance to Germany is less than 100 km and our grids are somewhat connected. Here in southern Sweden the cost has been negative during daytime for the past week, a trend that will probably continue over summer thanks to solar energy. The net cost for buying electricity is still a bit over 0 though, because of taxes and transfer fees.


There's only 600MW of direct grid connection between Sweden and Germany[1]. Southern Sweden uses 4GW right now, Germany 47GW.

[1]: https://en.wikipedia.org/wiki/Baltic_Cable


SOUTHERN? Sweden? We have the worst electricity prices within Sweden (Malmö at least). The connections to the north, where there is a lot of power generation, are really weak, and they shut down nearly all the power plants in the region, so we are stuck importing expensive energy.

My building has solar on the roof and a collective agreement with e.on precisely to try to control the insane costs.


Prices were high two years ago, but that is no longer the case. A lot of solar power came online last year. Currently the price is negative:

https://www.eon.se/el/elpriser/aktuella


And then locals don't want wind power.


In Finland locals started wanting wind power when it became clear a wind power park can bring considerable tax income for a small municipality.


> considering an RTX 4080 draws 320W alone

> having to buy a 1300€ GPU, which one would have to replace every year

Feels like gamers are quite the demanding audience. Instead, they could be getting something like an Intel Arc A380 (or something that supports AV1), upscaling from 720p/1080p using framegen and running games on medium settings. That might make the equation work out better, though perhaps not if the industry pushes for more and more complex graphics all the darn time, while the engines themselves are capable of scaling back all the way to mobile devices.

I probably have lower standards in that regard; I wouldn't expect to max out graphics when trying to play games on a MacBook or a netbook or something, due to the smaller screen anyway. It feels like modern game graphics are so noisy you can hardly make things out well. Nowadays I mostly play indie games that aren't super graphically complex, yet still are lovely experiences.

That said, my current GPU is actually an Arc A580 (replaced my old RX 580) and it's been pretty good since I got ReBAR working, and 1080p at 60 fps is the sweet spot for me (except slightly higher framerates like 72 fps feel better in VR).


> upscaling from 720p/1080p using framegen

This isn't something you can just do. I mean, it almost is (look at apps like Lossless Scaling, Magpie), but it's always best implemented by the game itself.

Just because the buzzword exists doesn't mean the problem is solved.


From everything I've heard and seen, it seems like XeSS/FSR/DLSS are all making steady progress and the internal rendering resolutions being lower than what's on your screen is no longer such an issue, with more and more improvements being made.

There's no reason why you couldn't have the game be upscaled to whatever the stream resolution is from something a bit more performance friendly, with the quality still being good enough (given that some video artifacts will be there anyways) and things like text/UI being legible.

In regards to frame generation, it seems that eventually whatever Nvidia is doing with DLSS 3 will find its way into other vendors' products in a comparable alternative form. Once game engines like Unity/Unreal/Godot get support for the upscalers out of the box, I think the technology will be even more commonplace.


> From everything I've heard and seen, it seems like XeSS/FSR/DLSS are all making steady progress and the internal rendering resolutions being lower than what's on your screen is no longer such an issue, with more and more improvements being made.

That's nice, but doesn't really fix the issue of them not being available in every game, or it varying which one even is available.

> There's no reason why you couldn't have the game be upscaled to whatever the stream resolution is from something a bit more performance friendly, with the quality still being good enough (given that some video artifacts will be there anyways) and things like text/UI being legible.

If you're just going to upscale the final image without any internal cooperation from the game, there is no reason to upscale a 720p game and stream 1080p when you could simply stream 720p instead and have the client do the upscaling, unless of course you are using an upscaler that chokes on video artifacts (in which case you probably shouldn't even have video artifacts, it's 2024 and H265/AV1 exist).

> Once game engines like Unity/Unreal/Godot get support for the upscalers out of the box, I think the technology will be even more commonplace.

Don't they all already support it out of the box? It just needs cooperation from the game dev, like I said.

For example, Pacific Drive uses UE4, not even UE5, and still supports DLSS. It almost works fine, except displays (such as the ones in the car and garage) flicker and shimmer under DLSS because the devs didn't implement them in a way that works with the upscaler.

This can't just be solved by the engine, since DLSS integrates multiple different views of the game (depth buffer, motion buffer, etc) and the dev needs to make sure all of these views are faithful to the effects they're actually showing on-screen, else bugs like that will just happen.

That specific problem could be solved by running the game at native-720 and then using a naïve upscaler such as FSR on the output framebuffer, but that's going to be a significant visual tradeoff compared to if DLSS were implemented correctly. Upscaling isn't magic, but FSR (upscaling the final render output) is even less magic than DLSS (upscaling multiple intermediate artifacts).

And games often omit settings from their menus, even those that work perfectly well in the engine itself. Having upscaling implemented by the engine doesn't necessarily mean the dev cares enough to implement the setting for it and test it.


> having to buy a 1300€ GPU, which one would have to replace every year

The flagships usually get refreshed every 2+ years, not yearly. If you're switching yearly, then every second change is gonna be a performance downgrade.

Furthermore, the flagship of the last generation is usually at least on par with the card below the new flagship, so you'd have top-of-the-line performance for at least 2 generations, which is 3-5 years. The 4080 specifically has worse performance than a 3090 unless you're using DLSS/upsampling, which most people on high-budget systems don't want as it increases latency.

Your napkin math wrt the energy doesn't really make sense either; your card only draws that much power if it's actually on full load. My 4090 is usually mostly idle.

My MacBook Pro actually draws more than my desktop PC with a 4090 if both are idle (that's because the MacBook has an integrated display, which offsets the higher power draw of the desktop components), at least according to the measurements of my smart plugs that measure energy drain at the wall.

Finally, there is no way in hell a working adult will find time to play 4+ hours a day, every day.

Overall I'd say you've no idea what you're talking about and are just rationalizing and in denial. The performance you get from cloud gaming providers is terrible, worse than a budget PC which costs 1k in its entirety. And you'll still have it at the end, while the cloud gaming subscriber will have nothing after the sub ends.


GeForce Now works great. I subscribed for a while to play Cyberpunk 2077 on a remote 4080 and it felt like playing on a local computer. But with much higher settings and framerate than I get on my local GeForce 1080.

> Finally, there is no way in hell a working adult will find time to play 4+ hours a day, every day.

Don't make the assumption that everyone lives their life exactly like you do.


A 1080 is 8 years old and wasn't a flagship; an equivalent card costs ~$100 currently... Also: that's 4 generations, not 2.

It'd be a complete disaster if gfn couldn't beat that performance.

I've attempted to use GFN several times too and can only conclude that you've never played on a decent setup if that felt like a local machine.

The input latency is so high that anything with pvp is entirely unplayable (it adds at least 40-80ms).

Maybe the only thing you're playing is cinematic single-player? In that case, sure, it works well enough. Not worth the subscription, as you'd get better performance and experience from a Steam Deck, but to each their own.


I have and love a Steam Deck, and I used GeForce Now for about 18 months before I got a new 3080 rig. We might be playing incredibly different games in very different ways, and that's ok, but GeForce Now being beaten by the Steam Deck is, again, taking a reasonable position way too far.

GeForce Now lets me play cp2077 on ultra with all the ray tracing, on a high-resolution 32" monitor. The Steam Deck can "play" cp2077 at 800p if you don't mind a melting jet engine in your hands. They're both great for what they do and are, and they can make a powerful combo, but in terms of performance I don't see how they can possibly be compared :-/

Edit: did you by any chance only try the free / low tier version of GeForce Now?


No, Nvidia always gives out a 3 month free trial with every purchase. I tried it out when I bought a 3080 when it was released and a 4090 last year.

> cp2077

That's indeed a game that works well with GFN, Witcher 3 and similar games too. Pretty sure you're still playing at 720p on GFN though; they're just using DLSS for upscaling. You should be able to see the upscaled resolution in the settings of Cyberpunk if I remember correctly. My sub ran out however, so no checking anymore.


Thanks for reply!

Fwiw, CP2077 ran at full res internally. I've played it both with and without DLSS on GFN. I believe when I started it was a 2080 for the first paid tier and a 3080 for the top paid tier. Now the top paid tier is a 4080.


It's great for turn-based games like BG3 or Cities: Skylines. My biggest issue with it is how long it takes to start up a session.


> Your napkin math wrt the energy doesn't really make sense either; your card only draws that much power if it's actually on full load.

Not only that, but there's evidence that more powerful cards can actually perform the same workloads as less powerful cards more efficiently. Of course, it's not going to be more efficient if you let it run at full tilt, but if, for example, you lower the power limit of a 3090 to 200W, it's still going to blow a 3060 Ti (also 200W) out of the water in terms of performance. Higher performance for the same power means you can also get the same performance at a lower power.

> Finally, there is no way in hell a working adult will find time to play 4+ hours a day, every day.

I work full-time and still get the hours of 5 PM to 12 AM to myself, which is 7 hours, and then I can still get 8 hours of sleep.


I agreed with everything up to "the performance of cloud gaming is terrible". I'm an extremely happy GeForce Now user. I have gigabit internet and use it with a 27" monitor, historically for performance reasons and, now that I got a 3080, for occasional convenience reasons (I have half a dozen Intel NUCs around the house, and it's usually simpler and more reliable to stream a game from GeForce Now servers thousands of km away than via Steam streaming from my upstairs PC, go figure).

Now! Neither one of us has spoken our assumptions out loud, so let's do that: I play mostly single-player games. There are people who mostly play PvP and cannot live without a 250Hz screen and a million-DPI mouse, and those users likely won't enjoy cloud gaming. That does not mean "cloud gaming is terrible" overall, but it may well be that "cloud gaming is terrible for highly demanding competitive PvP usage" :).


0.328 €/kWh and they are complaining! Bah!


May I ask what benefit you get from GeForce Now when you already have a 4080 at home?


Noise/heat/electricity cost/not infesting the PC with kernel-level anticheat.


What assumptions make it less than profitable at $20? If the hw costs are $1000 and you'd need one hw setup for every 5 subscribers, you'd get $2400 in 2 years for $1000 in hw investment. Power, bandwidth and other costs are something, but it doesn't sound obviously unprofitable to me.
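
Spelling out that back-of-the-envelope math (a sketch; the $1000 hardware cost and 5 subscribers per rig are the assumptions from the comment above, before power, bandwidth, support and licensing):

    # Revenue per rig over two years vs. the assumed hardware cost.
    subscribers_per_rig = 5
    price_per_month_usd = 20
    months = 24
    hardware_cost_usd = 1000

    revenue = subscribers_per_rig * price_per_month_usd * months
    print(revenue, revenue - hardware_cost_usd)  # 2400, 1400 left before running costs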


The hardware could be used for other things. I think one 4080 card can also serve the lower tiers, roughly equivalent to two 1080 cards. I would bet they rent out capacity for other stuff during the day too.

I would also be surprised if they were only 5:1 on users. Still, it's probably not their most profitable venture.


First, you take USD 4-5 off that in tax. Then more for support, finance and marketing. Then electricity, bandwidth and insurance. Then game licences. Then hw...


Corporate tax is a flat 21%, and that's on profit, not revenue. I don't believe Nvidia pays anything for "licenses" either. There's no way that overhead should make them unprofitable even at retail prices.


I was thinking of sales tax / VAT.


I get better latency and frame rate than GeForce Now on a 2660 using Moonlight. Using H.265 or AV1 is necessary. I'm not familiar with Parsec, but I'll bet it's defaulting to H.264.


The default for Parsec is H.265; it's only older machines that can't hardware encode/decode that fall back to H.264.

The reason they're likely seeing latency issues is that Parsec doesn't have the localized servers that GeForce Now has. Servers closer to your location will always have lower latency.

This is a simple case of Nvidia having more money, more servers, and better infrastructure than Parsec. Which isn't surprising given the scale and cashflow difference between the two companies.


Parsec doesn’t use their servers though. It’s peer to peer once discovery is done. So the “server” is likely elsewhere in your house. That’s basically as close as you can get.


Oh, wild, I didn't know that. I definitely see some high latency on Parsec within my home.

It might be bouncing around through the net somehow instead of transferring directly within my local network? I wonder if something about local peer-to-peer discovery isn't working properly?

Maybe that's also happening to the other person comparing GeForce Now to Parsec?


I've been using Parsec on my local network for several months now and I see absolutely no latency whatsoever. GFN and any streaming provider gives me a very small but noticeable lag in shooters, but Parsec on my home network is genuinely perfect. I game from my MacBook and I play whatever I want from my home server (3090 and 4060) in another room. Both my MacBook and computer are wired, of course. Important to note is that I needed a dummy HDMI plug on my computer for a 'true' screen; any kind of virtual screen caused it to lag hard, probably because the game or stream ran without any kind of hardware acceleration.

I have to say, I love the way Parsec works and allows you to use your whole library (since it doesn't stream the game, but it streams your whole desktop). But if there is ever a service that allows you to get the same thing including remote servers (since I travel), I would probably switch.


Appreciate the details you’ve provided!

I’m going to do some more Parsec experiments this weekend thanks to your info.

Something is clearly wrong with my setup, I’m going to figure it out.


I'm not a professional myself either, but the things I would check are:

- Make sure you run hardware acceleration; you can see if this is applied after opening a connection and checking the details. Without acceleration it will be terrible. This can either be as simple as enabling it, or possibly you need the dummy monitor mentioned below.

- Make sure a monitor is plugged in (there might be a way around this, but I just bought a $4 HDMI dummy instead of trying to figure out how to do this the software way)

Edit: formatting


In my experience if both sides are wireless the connection quality is terrible, no matter how good the signal is.


Yeah, everything here is 2.5G ethernet end-to-end. I don't think that's the issue. :)


I also enjoy Parsec at home (and for non-gaming remote desktop). For some reason, away from home (I'm travelling to a city 10ms away) it's not as good.


Peer-to-peer across networks is impossible unless you can punch a hole or open a port mapping, e.g. via UPnP. It's possible this isn't enabled, and that could cause Parsec to have to fall back to a central server.
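
For what it's worth, requesting such a port mapping programmatically is straightforward where the router supports it; a minimal sketch with the miniupnpc Python bindings (assumptions: UPnP/IGD is enabled on the router, and the port number is arbitrary):

    import miniupnpc

    # Ask the gateway to forward a UDP port to this host so a peer can
    # reach us directly instead of falling back to a relay server.
    upnp = miniupnpc.UPnP()
    upnp.discoverdelay = 200  # ms to wait for IGD discovery responses
    upnp.discover()
    upnp.selectigd()
    upnp.addportmapping(41234, 'UDP', upnp.lanaddr, 41234,
                        'game stream example', '')
    print(upnp.externalipaddress())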


Why don't you use ZeroTier / Tailscale?


Because those aren't peer-to-peer in reality. They're peer-to-peer in terms of a virtual network that's often (not necessarily, but certainly in this case) implemented using a central server anyway.


Those are peer to peer. The coordination server manages the public keys, and exchanges the ip addresses initially. After that peers connect directly.

Relays are involved only briefly initially.


How is Tailscale not P2P? Just because there's a central server for discovery and key exchange does not somehow make it not peer-to-peer.

If the actual communications between two clients is P2P then, by definition, the VPN is P2P. Because that's the actual part people are talking about when they talk about VPNs.


I mean, sure, but that doesn't mean that Parsec uses central servers by default, which was the entire point of the GP comment; that their servers are worse than Nvidia's.


Parsec uses whatever server you tell it to.

Parsec will discover servers on LAN and/or with a port open and display them as distinct entries in the connection list, usually positioned in front of the central server options.

So there's no default as you're always the one choosing.


What? No. That's not how Parsec works at all. Are you talking about the same app?

Parsec displays computers to connect to. If a hole can be punched, it will use a P2P connection. If not, it automatically falls back to their STUN servers or the Parsec Relay server if you're on the Teams plan. There is no "server" to choose from. There aren't multiple entries for a single computer with different connection options. I don't know where you hallucinated all of these things but that's just not how Parsec works.

This article makes it quite clear that P2P is the default with a fallback to STUN: https://support.parsec.app/hc/en-us/articles/4440911373325-P...


> I don't know where you hallucinated all of these things but that's just not how Parsec works.

I opened the app and looked at it. I distinctly remember there being multiple entries if a computer is on the local network. I don't have another computer on the network right now so I can't check.


I'm not sure you would use H.265 for "better latency", as its encoding overhead is greater than H.264's. Sure, H.265 requires less bandwidth, but it is also a slower encoder. AV1 is supposedly a fast encoder, but hardware to encode it is limited to the most recent generation.

Capture method can also have a big effect on latency, and technologies like the Desktop Duplication API or NVFBC are necessary for achieving lower latencies, by minimising data transfer between the CPU and the GPU when capturing and encoding.


I'm getting a stable 5ms latency over ethernet with Sunshine + Moonlight (forced HW encoding & decoding). Wi-Fi adds a few ms, and you need a good router & a mostly idle access point for perfectly smooth 4K. For FPS like L4D2 I do get better results (aim) running locally, but I ascribe that to the mouse acceleration weirdness which I never managed to configure _just right_.


This is my experience, 4-6ms. Even over the internet on a different continent I can game from my home PC with negligible latency with Moonlight.


Does the higher price tier work better than the low tier? I usually get weird stutter and resolution drops/packet loss during peak times or network congestion at the Nvidia end. It's typically related to new game or game season releases, so I think it's on Nvidia's end.


Beautiful, and agpl3 too! Can't wait to try it.

I think self hosted game streaming is a field that could use a lot more attention, from consumers as well. I set up moonlight/sunshine on my pc and now I can play cyberpunk on my 3080 machine from basically anywhere in the country, on my phone! Just got some gamesir controller thing to turn it into a sort of Nintendo switch type device and away I go. Actually, I have basically every emulator I could get my hands on as well, so my phone basically is a Nintendo switch so long as I have a decent internet connection. That's super cool to access my game library from any device that can run moonlight! (That's many devices btw. I recently found a moonlight client on the 3ds homebrew store. Haven't tried it yet but gives you an idea how absurdly widespread it is)

This stadia thing seems more about streaming through a browser so I'm really curious how things like Bluetooth controllers will work, whether there's onscreen controls available for mobile devices, that kind of thing. Will try it and find out! I REALLY love that it seems like I can share with my friends, as a jellyfin, audiobookshelf, konga, navidrome, and calibreweb / opds enthusiast, if I can add videogames to the mix that would be amazing. I already share my steam library through family sharing with some friends cause why not? Otherwise it's going to waste!


"and agpl3 too!" If you don't mind, I'm curious what you mean by this :) most service hosting providers I'm aware of (and worked with in the past) run away from AGPL3 software due to the viral nature of the license.


Isn’t that a good thing?


I'm a rabid FOSS evangelist and anticapitalist. If businesses hate it, I typically like it ;) but I also personally believe FOSS, even "viral" licenses (honestly I haven't seen evidence of this in practice though I hear it a lot about GPL), can lead to successful capitalist efforts. For example I believe that FOSS code leads to greater innovations that various organizations can capitalize on and drive profit through. SaaS hosting comes to mind, my co-op self hosts almost all of our saas, however there's one or two that are mission critical enough that we just fork over the cash to the experts to host.

In this case, there's already closed source solutions, such as stadia and GFN. Stadia shut down and took everyone's games with it (luckily offering refunds). Pure capitalist ventures are demonstrably unsustainable and don't serve the needs of the greater populace. But FOSS doesn't have that problem - if someone offers to host a FOSS GFN, you can pay to use their computers, that's really awesome! If they shut down their business, someone else can take it up, or you can host your own! Also awesome. I think there's valid business models there. I have a decently powerful gaming machine but I still pay for GFN for when I want to try a 4090 or when they have better throughput to my location.

Also, recently Nvidia stopped supporting the API that the self-hosting option Moonlight uses, forcing the development of a pure FOSS protocol, thus leading to Sunshine (IIRC), a similar self-hostable game streaming server. Clearly the corporations don't care if we like things; they'll just shut them down (Stadia, every other Google product ever). So if these corpos come up with something people clearly like, it's a very good idea to build a FOSS version so we can keep having the thing when the corporation determines the profits won't reach several billion so it's not worth supporting (also, some other company can then offer the service and simply make millions instead! Everyone wins).

Sorry I'm on my phone so it's a disorganized rant. I'm also a big user and hoster but I'm relatively ignorant so I might have gotten some terms wrong.


AGPL is good because it means people can't sell a hosted service based on this without open-sourcing their fixes, which is the spirit of open source; but GPL was designed before there was so much internet connectedness.


Really nice. What is your latency?


I'm not sure how to objectively measure this but if you have an idea I'm happy to give it a try and let you know!

For the record I live in Taiwan and apparently we have some of the best internet access in the world. And, I get 5g basically everywhere in the country, even when I'm deep in the mountains.

But subjectively I can say the latency is usually unnoticeable at like 30mbps, though I'm sure a pro FPS gamer would notice. The latency from the Bluetooth controller is far worse. What is a little annoying is, I think, the compression artifacts: whatever it is that causes blockiness when I move my camera around, but isn't really visible on static objects.


This is super cool!

A few questions:

1. just how low is the latency? "Netris delivers a zero-latency gaming experience that won't eat up your data plan." What does this mean? Wouldn't the latency be at least from my computer, to Netris server, to the game server?

I would assume you can't play FPS/MOBA games on these?

2. "10,000+ games supported" "games that come with their own launchers are not yet supported" I'm curious about the technical differences between steam games and games with their own launcher. I guess I was assuming that you're more or less just pixel streaming to me(?)

3. How does it compare to GeForce NOW? Would love to see one of those long charts with checklists.


As a GFN user myself the biggest single problem is publishers getting a say in whether I can run their game on the service. With self hosted solutions like Netris you don't have those limitations because they're not baked into the software. In terms of other differences I'd imagine that you would end up paying more for this in the cloud. As someone else has mentioned, NVIDIA are likely to be subsidizing access to their GPUs.


> games that come with their own launchers are not yet supported

Please let this be the nail in the coffin for those extra clicks and annoying "Log in with our service now" for games which have no online stuff whatsoever (take2 is notorious for this)


Games that come with their own launcher start up 1 executable, then exit it and start another one.

It's really easy to track that first one. But the second one is a pain in the rear.

I'm guessing that's the problem they're still trying to solve without manually inputting info for each individual game, or making the player do that.


> It's really easy to track that first one. But the second one is a pain in the rear.

Tracking child process spawns/handles, or even resorting to polling for the process name for a while, should all be fairly simple to implement…
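
A sketch of the polling approach (hypothetical helper using psutil; the function and executable names are made up for illustration):

    import time
    import psutil

    def wait_for_real_game(launcher_pid: int, game_exe: str, timeout_s: float = 120.0):
        """Poll until a process with the given executable name shows up,
        preferring descendants of the launcher; returns it or None."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            # Descendants cover launchers that stay alive after spawning the game.
            try:
                candidates = psutil.Process(launcher_pid).children(recursive=True)
            except psutil.NoSuchProcess:
                candidates = []
            # Fall back to a full scan for launchers that exit after handing off.
            candidates += list(psutil.process_iter())
            for proc in candidates:
                try:
                    if proc.name().lower() == game_exe.lower():
                        return proc
                except (psutil.NoSuchProcess, psutil.AccessDenied):
                    continue
            time.sleep(0.5)
        return None

    # e.g. wait_for_real_game(launcher_process.pid, "TheActualGame.exe")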


GeForce Now used to have tons of "issues" with people easily being able to run any app or program they wanted, using a few tricks. Last time I tried it, it seemed like it was all locked down a lot better, but previously it'd still be Steam opening the other program. It was nice while it lasted: since they wanted Cyberpunk available day one, it was actually accessible about a week before the game released.


>"Netris delivers a zero-latency gaming experience that won't eat up your data plan." What does this mean? Wouldn't the latency be at least from my computer, to Netris server, to the game server?

I agree zero-latency sounds impossible. But why wouldn't the Netris server and the game server be the same machine? So that's 1 less latency hop.


Wait, I thought that you need either a hacked driver or a very expensive license to split up Nvidia GPUs, even within containers?


Wasn't it just a very expensive GPU which basically had a flag that enabled the feature in the driver? I seem to recall reading about people unlocking their NVIDIA GPUs to allow the nested virtualization.

--- Edit: here it is https://github.com/DualCoder/vgpu_unlock


Shout out to Wolf, which on Linux actually allows you to run multiple sessions on one GPU: https://github.com/games-on-whales/wolf


Thanks for the unexpected shout-out to Wolf (I'm the main dev behind it)!


I'm not sure this does split up the GPU. One of the listed self-hosted requirements is "Your Nvidia GPU should not be attached to a running X display server." Sounds like exclusive GPU use to me.


This looks like a really cool project! I do have two questions, though.

1. Do you plan to add AMD support?

2. What advantages does Netris have over leaving a PC with Steam always on and then using Steam Remote Play?


Being exclusive to Nvidia GPUs doesn't sound like a good idea. What makes it rely on CUDA anyway?

It's cool that it's using Wine though. Of the cloud services like Amazon's Luna or GeForce Now, none are running Linux, so this is interesting.

No idea why Stadia couldn't do the same and tried to reinvent the wheel with Windows ABI translation at some point before it shut down.


I guess the Google team wanted to create an experience that is identical to the console gaming experience, in the sense that users don't have to adjust any settings before playing. Because of this requirement they couldn't use Windows VMs, because those sometimes have issues like the game not being displayed in full screen. I also guess that Google probably saw some cost-saving potential in using Linux instead of Windows.


I'd guess it does the video encoding on the GPU before it goes back to the PC. GeForce Experience recording does this as well I believe.


They don't need CUDA for video encoding?

For video they can rely on VAAPI or Vulkan video (the latter might be behind on AV1 encoding options though) which can work on any GPU with video acceleration hardware.


My understanding is that at a certain point the only way to keep climbing the corporate ladder at Google is for others' projects to fail and yours to make it, which explains why they come out with a new messaging app every 3 months before killing it.


Just to tag along, we are building a P2P GeForce Now at https://borg.games . It is not open-source though (and neither is this Netris btw, streaming is not part of the source repo).

Only a couple hard coded games are currently available, but we are close to getting GOG.com support.


Does anyone have experience using Nucleus Co-op for remote play? It'd be cool to take this huge collection of games which have been gotten running multi-instance with a hijacked network stack & spoofed APIs, and use it for game streaming with friends.


When will 2560x1440 finally become a standard among these streaming services? It's either 1920x1080 or 4K, nothing in between (price-wise, but also client/browser-wise).


Most gaming monitors also seem to be primarily 2560x1440. The 4K ones cost significantly more.


It's one of the reasons I cancelled GeForce Now: a total lack of support for higher resolutions on Linux. An alternative should consider the downsides of the competition. (Don't get me wrong, I really like this initiative :)


GeForce Now has 1440p.



