I really doubt an ethernet connection can push a full HD video frame in 1ms. The <1ms is for ping, which uses a really small network packet. Pushing 1080p HD video is a totally different matter.
$ ping -s 1450 192.168.11.10
PING 192.168.11.10 (192.168.11.10): 1450 data bytes
1458 bytes from 192.168.11.10: icmp_seq=0 ttl=255 time=0.646 ms
1458 bytes from 192.168.11.10: icmp_seq=1 ttl=255 time=0.478 ms
1458 bytes from 192.168.11.10: icmp_seq=2 ttl=255 time=0.469 ms
It would be insane to do this, but you could shove ATSC between boxes over ethernet by stuffing each 188-byte MPEG transport stream packet into an ethernet frame and skipping all the layer 3 stuff. Or UDP it, if you want to route it. They probably like the idea of a difficult-to-route protocol keeping data on one ethernet segment.
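To make the UDP variant concrete, here's a minimal sketch, assuming nothing from the thread beyond the 188-byte TS packet size. Grouping seven packets per datagram (7 * 188 = 1316 bytes) keeps each UDP payload under a 1500-byte MTU, and is in fact the conventional grouping for MPEG-TS over UDP:

```python
# Sketch: split a raw MPEG transport stream into UDP-sized payloads.
TS_PACKET = 188
PER_DATAGRAM = 7  # 1316-byte payload + IP/UDP headers stays under a 1500-byte MTU

def ts_datagrams(ts_stream: bytes):
    """Group a raw transport stream into UDP-sized payload chunks."""
    chunk = TS_PACKET * PER_DATAGRAM
    return [ts_stream[i:i + chunk] for i in range(0, len(ts_stream), chunk)]
```

Each chunk could then be handed to socket.sendto() toward the receiving box; a short final chunk just means a smaller last datagram.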
My HDHomeRun ATSC receiver certainly has no problem shoving ~19 megabits of high def video over ethernet. Nor does my MythTV setup, or even just plain old NFS shares for watching videos.
For raw 720p RGBA you'd send:
1280 * 720 * 32 = 29,491,200 bits per frame, but obviously you're not going to send raw frames. Suddenly you're not only sending data but also encoding / decoding it.
Indeed.. assuming 60fps, that works out to around 1.8 Gbps. Well within the range of HDMI, but well out of the range of your off-the-shelf router on ethernet.
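For anyone who wants to check the arithmetic, here it is spelled out (nothing beyond the numbers already in the thread):

```python
# Raw 720p RGBA at 60 fps, uncompressed.
width, height, bpp, fps = 1280, 720, 32, 60
bits_per_frame = width * height * bpp   # 29,491,200 bits per frame
bits_per_second = bits_per_frame * fps  # 1,769,472,000 ~= 1.77 Gbit/s
```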
..though 802.11ad is supposed to be out early 2014, maxes out around 7 gigabits, and has cooperation from the HDMI consortium to use it for streaming. I wonder if some kind of ultra-high-speed wireless dongle is in the Steambox's future..
It doesn't matter, that's the point. Your MTU is almost certainly 1500 bytes. You are sending 1500 byte packets at the most. That does not cause latency. If you want to argue that we're incapable of encoding or decoding video with acceptable latency go right ahead, but doing it in response to me correcting a misconception about network latency doesn't make much sense.
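To put a rough number on why MTU-sized packets don't add meaningful latency, here's the serialization delay alone (time to clock the bits onto the wire; queuing and propagation ignored):

```python
# One 1500-byte ethernet frame at common link rates.
frame_bits = 1500 * 8
fast_eth_us = frame_bits / 100e6 * 1e6   # 100 Mbit/s -> 120 microseconds
gige_us = frame_bits / 1e9 * 1e6         # 1 Gbit/s   -> 12 microseconds
```

Both figures are tiny next to the ~0.5 ms ping times quoted above.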
Having worked on the particular issue of streaming real-time video over wireless, I can say that the main issue is not link latency.
The main lag comes from encoding/decoding. If you do it naively you encode frame by frame (encoding slices is more difficult), and the encoder doesn't output only I-frames: you get partial frames that depend on both previous and future frames, so coded frames don't arrive at the decoder in display order. So you have to expect something around ~10 frames of latency, maybe less if you optimize everything well enough. That still easily means more than 100ms of lag.
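A quick sanity check on the "~10 frames" figure (the frame rates are my assumption, not from the parent comment):

```python
def frames_to_ms(frames: int, fps: float) -> float:
    """Convert a pipeline delay in frames to milliseconds."""
    return frames / fps * 1000.0

# 10 frames of pipeline delay:
#   at 30 fps -> ~333 ms, at 60 fps -> ~167 ms
# so "easily more than 100ms" holds even at 60 fps.
```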
I would think that the people at Valve would be able to find a solution if anyone could... or are you claiming that this is a hard nut to crack and SteamOS's streaming solution won't end up being that great for high-resolution TVs?
Oh I'm sure if they put their minds to it they can make some improvements and clever optimizations. The thing is, it becomes exponentially more difficult the lower the latency you want to achieve, obviously. <300ms? Easy. <100ms? Manageable. <50ms? Hey, very good! <10ms? Uh, I want to see it with my own eyes.
Also, you have to remember that Steam won't control the encoding end of the pipeline (and if the "steambox" is third-party hardware with SteamOS installed, it has no control over the hardware at all). Which means that in the end the observed latency will depend a lot on the hardware and drivers of the desktop PC, and there isn't much Valve can do about that.
So in the end I'm sure it'll be more than fine for Civilization or Torchlight, MMOs and most RPGs, but maybe not Counter-Strike or Quake III.
Why are people acting like this is impossible even after it has been done? Remember OnLive? Notice how the latency was essentially just your network latency to their servers, and there was no problem with encoding adding any noticeable additional latency?
What about scrapping the conversion to streaming video entirely and replacing it with a networked graphics protocol that allows one PC to draw on another's graphics natively?
Well, that would simplify things greatly of course, but then it means a big hit on bandwidth.
I mean, a 720p 60Hz stream in 4:2:0 (12 bits per pixel) still amounts to 663 Mbit/s. You won't realistically get that out of a gigabit link (at least not over IP). Of course you could use a lightweight compression algorithm, but I'd say you'd have to divide this bandwidth by at least 5 to make it manageable for the average home network (I have absolutely nothing to back that last number, but 100 Mbit/s doesn't look too scary...). And that's only for 720p, remember.
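Spelling out that arithmetic (the factor-of-5 target is the parent's own guess, just carried through here):

```python
# 720p, 60 Hz, 4:2:0 chroma subsampling = 12 bits per pixel.
width, height, bpp, fps = 1280, 720, 12, 60
raw_bps = width * height * bpp * fps  # 663,552,000 ~= 663 Mbit/s
lightly_compressed = raw_bps / 5      # ~133 Mbit/s; closer to /7 to reach 100
```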
I think if you plan to stream HD video over the network you have to encode and be clever about it.
It seems to me, intuitively, that the only ways to do that are:
1. Essentially equivalent to streaming, or
2. Essentially equivalent to normal CPU->GPU communication but over the network (and, thus, needing the bandwidth that local CPU->GPU communication has if you want to avoid slowing things down -- GigE wouldn't seem to be enough, much less WiFi, even if you consider only bandwidth and not latency.)
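For a sense of scale on point 2, here's a rough comparison using nominal figures (taking PCIe 2.0 x16 at 500 MB/s per lane as a stand-in for the local CPU->GPU link is my assumption, not something from the comment):

```python
# Nominal peak bandwidths, ignoring protocol overhead on both links.
gige_bytes_per_s = 1e9 / 8          # ~125 MB/s at best
pcie2_x16_bytes_per_s = 500e6 * 16  # ~8 GB/s nominal
shortfall = pcie2_x16_bytes_per_s / gige_bytes_per_s  # GigE is ~64x short
```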