Show HN: Unreal Engine 5 in WebGPU (twitter.com/alexstlouis10)
63 points by astlouis44 on July 9, 2024 | hide | past | favorite | 24 comments


Hey everyone, here's the demo link if you'd like to try it out. Please note that this version runs without sound and is CPU-intensive. We'll be enhancing performance, load times, and the overall quality of the experience in future updates.

Feedback is much appreciated!

https://lyra.tiwsamples.com/


Looking at the console I quickly get "No WebGPU support; not starting" (which is correct when I check the compatibility matrix, since I'm on non-Nightly Firefox), but the page just keeps saying "Loading" forever. Even for a demo it would be nice to surface that error to the user so they don't wait around for some potentially giant game to load (which was my first assumption).
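Surfacing it could be as simple as a `navigator.gpu` check before kicking off the download. A minimal sketch (the message text and the way it's wired into the page are made up, not from the demo's actual code):

```javascript
// Feature-detect WebGPU before starting the (potentially huge) asset download.
// Takes a navigator-like object so it can be exercised outside a browser.
function webgpuSupportMessage(nav) {
  if (!nav || !("gpu" in nav)) {
    return "This demo needs WebGPU, which your browser doesn't support yet. " +
           "Try a recent Chrome/Edge, or Firefox Nightly.";
  }
  return null; // supported: proceed with loading
}

// In the page, something like:
//   const msg = webgpuSupportMessage(navigator);
//   if (msg) { showErrorInsteadOfLoadingScreen(msg); }
```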

Edit: Tried again in Chrome. The game loaded after a while, but no 3D geometry rendered. I had a HUD with crosshairs, scoreboard, and ammo, and the game said it had started. I could see another player's name moving around and responding to my cursor movements as if I were in a 3D environment. Clicking when my mouse was a bit below their name seemingly killed them (or maybe that was a coincidence). But this was all just on top of a black screen.

On an M1 Mac, if that helps at all.


Thanks for the detailed report! We have recently received similar reports on M1 and M2 chips and we are going to investigate further.

Also, you're right about the loading screen for unsupported browsers, that is something we will add ASAP. Thanks again!


On Chrome for Linux you'll need

> The chrome://flags/#enable-unsafe-webgpu flag must be enabled (not enable-webgpu-developer-features). Linux experimental support also requires launching the browser with --enable-features=Vulkan.

Nightly Firefox should have it already.

https://github.com/gpuweb/gpuweb/wiki/Implementation-Status#...


FF: event is undefined

Chrome: Failed to load resource: the server responded with a status of 403 ()


And as usual, Chrome died, both on Windows and Android.


Wow, that's an amazing achievement.

At one point, UE had an option to build for HTML5, which was later deprecated and offloaded to a community fork [1] which was abandoned. So I thought it'd never happen. Well done!

[1] https://github.com/UnrealEngineHTML5


A friend and I forked UE4 a while back and stripped out all the bloat to try to get it running well on a shit-tier Chromebook. We only got it down to 30 MB, but the end result (physics + terrain) ran way smoother than any Unity game I've played on that ToasterBook.

Plot twist: people stop writing casual games in Unity and switch to Unreal to target low-end hardware.

Needless to say: as someone who spent my teens and early 20s working on multiplayer web games, I'm incredibly excited for this.


The problem is the editor: it still needs far more RAM than Unity's, and that makes people drop it. Were UE to reduce the bloat, indies would move instantly.


Oh please. Any gaming laptop can run the ue5 editor.


Yeah, I guess Epic expects indies to have plenty of resources. They can keep waiting for mass adoption of UE.


If I were an indie game dev, I'd rather pay for hardware resources than Unity license fees. It's $2K/year even if you're not making money yet. There is a free version but it's heavily limited (no physics, no VR).


Unity is free, can publish for free, it includes physics.


Unity is not free if you're part of a team that has made $100k in the last 12 months. Including funds raised.

Unreal is free, per project, until that specific project has earned over $1M in revenues.


Epic probably does expect the indie devs to have laptops, yes.


I opened the demo on my phone with chrome canary. It just crashed my browser.


Great to see more projects actually battle testing WebGPU. I've been working on porting an existing engine to WebGPU and personally I think it's one of the worst GPU APIs I have had the pleasure of working with. Perhaps if we see more practical real-world use-cases with real feedback we'll see some improvement in a later update.


Agreed, most studios would want to see maturity of the API, and that initial issues are ironed out.

I’d hazard a guess that WebGPU is going to usher in a new golden era of browser games, especially now that features such as compute shaders are on the table.


I seriously doubt it. WebGL 2.0 has so far failed to deliver something like Infinity Blade from 2011, the flagship game of iOS mobile gaming with OpenGL ES 3.0.


This is awesome. I love using 3D engines for visualisation, and I'm already writing my own GPU shaders to visualise electron density grids, for example. Having a high-performance runtime I can hook into would be such a big help, especially if it had a high-performance (60fps+) particle system that can handle tens of millions of particles.


Press X for doubt that you can run tens of millions of particles in Unreal in the browser.


If you use shaders and are smart about GPU data structures you can do it. I've got 5-6 million particles running smoothly (60fps) in my own testing, which is just a hacky weekend mash with no optimization. Here's a blog post where someone goes into detail on something similar:

https://dgerrells.com/blog/how-fast-is-javascript-simulating...

From the blog post:

> This version can hit 20m particles on an M1 mac at around 20fps on battery life. That is pretty crazy for pure javascript. My desktop can do about 30m at the same fps. I asked a friend who had a 32 core CPU to test it and they hit about 40m which is almost enough to fill five 4k displays.

Having a full engine toolkit at my disposal with mature systems for geometry, physics, particles, templating, etc. would make me much more productive than having to write the shaders myself, even if the engine has some overhead.

My comment above was a wish list, I know I'm asking for a lot, but I make scientific visualization software so I like to push at the edge!
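For what it's worth, the "smart about GPU data structures" part mostly comes down to flat typed arrays (structure-of-arrays) instead of per-particle objects, so the data can be handed to GPU buffers unchanged. A minimal CPU-side sketch of the layout (class and field names are mine, not from the blog post):

```javascript
// Structure-of-arrays particle store: one flat Float32Array per attribute.
// This layout maps directly onto a GPU storage buffer, so the same data
// can be uploaded for a compute shader without any repacking.
class Particles {
  constructor(n) {
    this.n = n;
    this.px = new Float32Array(n); // x positions
    this.py = new Float32Array(n); // y positions
    this.vx = new Float32Array(n); // x velocities
    this.vy = new Float32Array(n); // y velocities
  }

  // One Euler integration step; in a GPU version this loop body
  // becomes the per-invocation work of a compute shader.
  step(dt) {
    const { n, px, py, vx, vy } = this;
    for (let i = 0; i < n; i++) {
      px[i] += vx[i] * dt;
      py[i] += vy[i] * dt;
    }
  }
}
```

A WGSL compute shader doing the same step over a storage buffer is a near-mechanical translation of that loop, which is why the typed-array layout matters even in "pure javascript" versions.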


I’m sorry, I am dumb.


If it’s fully on the GPU that should be tenable, browser or not.



