
Isn't that expected? 4K Blu-rays only encode at up to ~128 Mbit/s, which is 16 MB/s. 100 MB/s seems like complete overkill.


I think OP just didn't type Mbps properly. 100 MB/s (~800 Mbit/s) is far higher than the GPU can even encode at a hardware level, I would think.


100,000 kbps. It will more than double that for 3240p.

https://i.imgur.com/LyrhNXZ.png


Right. That's 223,642 kilobits/s (kbps) in your picture, or ~224 Mbit/s, whereas you wrote (intentionally or otherwise) 200 MByte/s, an 8-fold difference (100 Mbit/s = 12.5 MByte/s). 100 MByte/s is 800 Mbit/s, or ~800,000 kbps, which is an order of magnitude more insanity than already choosing 100 Mbit/s for live streaming (and not physically possible on consumer GPUs, I believe).
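The whole thread hinges on the factor-of-8 difference between megabits and megabytes, so here is a minimal sketch of the conversions being argued about (the helper names are my own, purely illustrative):

```python
# Illustrative unit conversions for the bitrate figures discussed above.
# 1 byte = 8 bits; kbps here means kilobits per second (1000 bits).

def mbit_to_mbyte(mbit_per_s: float) -> float:
    """Convert megabits/s to megabytes/s."""
    return mbit_per_s / 8


def kbps_to_mbit(kbps: float) -> float:
    """Convert kilobits/s to megabits/s."""
    return kbps / 1000


# 128 Mbit/s (roughly the UHD Blu-ray ceiling) is 16 MByte/s:
print(mbit_to_mbyte(128))    # 16.0

# The 223642 kbps from the screenshot is ~224 Mbit/s, i.e. ~28 MByte/s:
print(kbps_to_mbit(223642))  # 223.642

# 100 MByte/s, taken literally, is 800 Mbit/s:
print(100 * 8)               # 800
```

This is just arithmetic, but it makes the correction concrete: writing "100 MB/s" when you mean "100 Mbps" overstates the bitrate by 8x.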


It isn't overkill for the amount of motion involved. Third-person views in rally simulators and continuous fast flicks in shooters require it.



