
I don't think it's quite that simple. DP is still missing HDMI features like ARC and CEC, which are important for TVs. Even on my personal computer setup, I use the HDMI 2.1 ports on my monitor/GPU rather than the DP 1.4 ports, because DP 1.4 just doesn't have the bandwidth for 2560x1440 @ 240 Hz with 10-bit colour. That requires roughly 30 Gbit/s, more than DP 1.4's ~26 Gbit/s of payload.
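For the curious, here's a back-of-the-envelope version of that calculation. A minimal sketch, assuming ballpark payload figures of ~25.9 Gbit/s for DP 1.4 (HBR3 after 8b/10b encoding) and ~42.7 Gbit/s for HDMI 2.1 (FRL after 16b/18b encoding); real timings also add blanking overhead on top of the active pixels:

    # Back-of-the-envelope bandwidth check (payload figures below are
    # assumed ballpark numbers, not spec quotes).
    def active_video_gbps(width, height, refresh_hz, bits_per_pixel):
        return width * height * refresh_hz * bits_per_pixel / 1e9

    need = active_video_gbps(2560, 1440, 240, 30)   # ~26.5 Gbit/s of active pixels
    # Real signalling adds blanking on top, pushing the requirement to ~30 Gbit/s.

    DP_1_4_PAYLOAD_GBPS   = 25.9   # assumed: HBR3 after 8b/10b overhead
    HDMI_2_1_PAYLOAD_GBPS = 42.7   # assumed: FRL after 16b/18b overhead

    print(need <= DP_1_4_PAYLOAD_GBPS)    # False once blanking is added: needs DSC
    print(need <= HDMI_2_1_PAYLOAD_GBPS)  # True: fits over HDMI 2.1 uncompressed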

Neither my monitor nor my GPU supports DP 2.0, which does have enough bandwidth, so until I upgrade both I'm using HDMI. My computer isn't outdated either; there's just nothing to upgrade to. None of Nvidia's consumer GPUs support DP 2.0, and I can only find two monitors that do. Anyone getting new hardware now will be in a similar situation, using HDMI 2.1 over DP 1.4 until their next upgrade.



> DP is still missing HDMI features like ARC and CEC, which are important for TVs.

ARC could also be considered a bug, a hindrance, or both.

ARC and its various implementations would not exist if the HDMI Forum were not so fanatical about forcing copy protection onto everything. The whole problem (or feature) that ARC is, or is not, would disappear without the insistence on protecting every stream. The alternative would be a full data stream, decoded, going back to the device in question. The prerequisite would be to get rid of the shitshow that HDCP is and allow full-blown HDMI inputs and outputs, which is the exact opposite of what the Forum wants.

HDMI in its current implementation hinders technological progress in the audio segment by forcing everyone to output analogue signals after the decoding stage, or to disallow decoding entirely.


Don't you also need ARC because of video post-processing that adds frames of latency? The TV needs to send audio back to the receiver, otherwise video and audio will no longer be in sync. Receivers/amplifiers can process audio with practically no latency, so it makes sense for them to be at the end of the chain.


You don't need ARC to address A/V sync. HDMI has (optional) metadata somewhere (in the EDID?) where the display device indicates its delay, and the audio device can adjust accordingly. It works best if the display device has a fixed delay; it's fine if there are different modes with different delays and the current delay is communicated, but some modes have variable delay depending on content, which is terrible in several ways.
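For what it's worth, here's a rough sketch of how those latency fields could be used, assuming the layout of the HDMI vendor-specific block in the EDID where each latency byte encodes (value - 1) * 2 ms, with 0 meaning "not provided" and 255 meaning "no output of this type" (treat the exact encoding as an assumption):

    # Hedged sketch: decoding the assumed EDID latency bytes and deriving
    # how much extra delay to apply to audio sent to a separate device.
    def decode_latency_ms(byte_value):
        if byte_value in (0, 255):       # not provided / no such output
            return None
        return (byte_value - 1) * 2      # milliseconds

    video_latency = decode_latency_ms(41)   # e.g. 80 ms of TV video processing
    audio_latency = decode_latency_ms(6)    # e.g. 10 ms through the TV's audio path

    if video_latency is not None and audio_latency is not None:
        # Delay the external audio path so it lines up with the slower video.
        extra_audio_delay_ms = max(0, video_latency - audio_latency)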

IMHO, ARC is primarily useful when the display device is also acquiring the content: it's running the TV tuner, internet streaming, or content off a USB drive. It's also useful if you have a 1080p-capable receiver and upgrade to a 2160p (4K) display and sources: if you put the receiver in the middle, you lose video quality, but with eARC the display can route full-quality audio from all your sources. Some sources do have two HDMI outs, so you could wire to both the display and the receiver, but that's not very common.


Ugh, DisplayPort already has the audio channel. As far as sync goes, neither protocol provides for effective reclocking or supplies the audio clocks, and you need VRR to get any sort of display clock.


DisplayPort has the audio channel but AFAIK has no _return_ channel, which is not needed in the typical computer setup but is quite useful in a TV + soundbar or AVR setup.


I admire your exasperation on this issue :)


I agree, but I also think your illustration of the problem is a bit off. The 90% of customers in the centre of the Gauss curve don't need the tail end of display-connector bandwidth.

However, devices have a lifecycle, and a lot of this hardware will still be in use in 2-3 years, by which point this will have moved into the centre of the Gauss curve. Higher resolutions and HDR (which may push 10-bit) will trip this much more than a 240 Hz display [which ain't ever gonna be mainstream, really, considering we went down to 60 Hz from CRTs with faster refresh rates].

CEC can be done over the DisplayPort AUX channel. I think there were attempts at an ARC equivalent but they floundered.

Another interesting question, though, is how much A/V connectivity in general will still be used in the "TV world" down the line, with everything moving towards more integrated networked appliances instead. E.g. streaming-service "HDMI sticks" are now apps on the TV instead…


I agree that it's an issue very few customers are going to run into. But that's also where the differences between DisplayPort and HDMI are. Those 90% are equally well served by HDMI and DisplayPort and will just use whatever they have.

Another 10% feature difference I do like on DisplayPort is Multi-Stream Transport for multiple monitors over a single cable. I don't think many people are looking to daisy-chain big-screen TVs.


ARC and CEC are only necessary because of this stupid situation where TVs are like displays with shitty media centres built in. ARC is only a tiny bit more convenient anyway; it's not that hard to run an audio cable back from the TV to an audio receiver, and you'll be hiding the cable anyway, so it doesn't matter in the slightest what it looks like.

In 2002 there was XBMC (later renamed to Kodi). Microsoft even had Windows XP Media Centre Edition in 2005. At that time it was perfectly possible to set up a media centre that could do everything. No need for shitty TV remotes and CEC. You would use a much higher quality remote of your choice. Oh how far we've come in 20 years...


> it's not that hard to run an audio cable back from the TV to an audio receiver and you'll be hiding the cable anyway so it matters not the slightest what it looks like.

That's fine for regular ARC, which is basically the same capability as S/PDIF, ATSC audio, and DVD audio. But there's no consumer audio cable with the capacity for lossless surround except HDMI, and then you really want eARC. Otherwise you end up with one HDMI running from the receiver to the TV carrying video (and maybe audio) for sources that can go through the receiver, and a second HDMI running from the TV to the receiver carrying audio only, for sources that can't go through the receiver: things built into the TV (the tuner, network streaming, playback from USB), plus devices that exceed the receiver's HDMI bandwidth or don't negotiate an appropriate video and audio format unless connected directly. I have a 4K Roku and a 1080p Blu-ray player that need different settings on the TV to work through my receiver, or I can wire one source direct to the TV and use eARC.
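As a back-of-the-envelope illustration of why only eARC cuts it for lossless surround (the ~37 Mbit/s eARC payload and ~18 Mbit/s TrueHD peak below are assumed ballpark figures, not spec quotes):

    # Rough sketch of why lossless surround needs eARC (link-rate
    # figures are approximate assumptions).
    def lpcm_mbps(channels, rate_hz, bits):
        return channels * rate_hz * bits / 1e6

    surround_7_1_48k_24 = lpcm_mbps(8, 48_000, 24)    # ~9.2 Mbit/s
    surround_7_1_96k_24 = lpcm_mbps(8, 96_000, 24)    # ~18.4 Mbit/s

    EARC_PAYLOAD_MBPS = 37    # assumed eARC audio payload ceiling
    TRUEHD_PEAK_MBPS  = 18    # assumed lossless bitstream peak

    # Plain ARC / S/PDIF only carries 2-channel LPCM or a compressed
    # bitstream, so none of the multichannel figures above fit there;
    # all of them fit within the assumed eARC payload.
    assert surround_7_1_96k_24 <= EARC_PAYLOAD_MBPS
    assert TRUEHD_PEAK_MBPS    <= EARC_PAYLOAD_MBPS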


Does eARC support AAC audio for surround sound or is it only DTS or AC3?


I'd guess AAC is technically possible, but not actually supported. A list of formats from a random current receiver is:

2-channel Linear PCM: 2-channel, 32 kHz – 192 kHz, 16/20/24 bit

Multi-channel Linear PCM: 7.1-channel, 32 kHz – 192 kHz, 16/20/24 bit

Bitstream: Dolby Digital / DTS / Dolby Atmos / Dolby TrueHD / Dolby Digital Plus / DTS:X / DTS-HD Master Audio / DTS-HD High Resolution Audio / DTS Express

I'd imagine whatever source is getting AAC is going to need to decode it and send it as linear PCM, which should be fine.


In my experience multi-channel AAC gets sent as multi-channel LPCM over HDMI, whether that be eARC or not. That's fine though, I don't really care what part of the chain does the AAC decoding because it has to be turned into LPCM _somewhere_.


It's still a perfectly valid choice.


> it's not that hard to run an audio cable back from the TV to an audio receiver

Wait until you find out that many consumer sound bars (Sonos comes to mind) only support the latest and greatest digital audio formats over eARC.


OK but audio technology of the 80s sounds better than the "latest and greatest formats" on a shitty soundbar so who cares?


Speak for yourself, but I'd rather have LPCM surround audio than deal with proprietary formats like Dolby Digital and DTS, which are the only way to get surround without using eARC over HDMI.

This has literally nothing to do with any kind of sound bar; toast0's reply to your original comment explains the situation in detail.


There is another way: decode it in your media centre and send it as analogue to your amplifier. Remember when media centres were actually capable? It has to be decoded to analogue somewhere. Dolby Digital and DTS are not the only way to get surround (also, good stereo is better than shit surround, but let's assume you mean good surround).

The whole thing about HDMI is a circular argument: you can only use HDMI because you can only use HDMI. There's nothing technical stopping another cable from supporting this stuff. That was my original point. We're in this situation for silly reasons, not technical ones.


That's assuming you're trying to avoid DSC. With DSC you can easily get 1440p @ 240 Hz even on DP 1.4.

Most monitors don't ship with DP 2.0 because it's just not necessary. All modern GPUs support DSC, so monitor OEMs take that essentially free ~3x bandwidth reduction.
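As a rough sanity check, treating DSC as roughly 3:1 visually lossless compression (the ratio and the DP 1.4 payload figure are assumptions; the real requirement depends on blanking and the DSC configuration):

    # Hedged sketch: does 1440p @ 240 Hz, 10-bit fit in DP 1.4 once DSC
    # is applied? (~25.9 Gbit/s HBR3 payload and 3:1 DSC are assumptions.)
    uncompressed_gbps = 2560 * 1440 * 240 * 30 / 1e9   # ~26.5, before blanking
    with_dsc_gbps     = uncompressed_gbps / 3          # ~8.8

    DP14_PAYLOAD_GBPS = 25.9
    print(with_dsc_gbps <= DP14_PAYLOAD_GBPS)   # True: fits easily with DSC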

Nonetheless, Nvidia shipping the RTX 4000 series without DP 2.0 is baffling.


Yes, I'm avoiding DSC because Nvidia GPUs have an issue with DSC when switching to full-screen games that causes a black screen for several seconds.

DP 1.4 also does have enough bandwidth for 1440p @ 240 Hz without HDR, so I only hit this issue with HDR on.



