HDMI 2.2 is set to debut at CES 2025 (tomshardware.com)
58 points by thunderbong on Dec 14, 2024 | 128 comments


HDMI should have been abolished in favor of DisplayPort ages ago.

https://hackaday.com/2023/07/11/displayport-a-better-video-i...

There are a couple of video cameras and viewfinders (particularly from BlackMagic) that have emerged in the last couple of years using DP.


HDMI should have made it easy to see which cable supports which features.

The HDMI versioning chaos did not help.

The worst is still USB 3.1 Gen 2.1 Hyper Giga Super Speed Mark II


Yeah I have no idea how normal non-tech people are supposed to make any sense of USB-C cables and device compatibility.

Even us tech people find it difficult. Small rant - I have the M1 Max MacBook which comes with “USB4”. I remember at some point getting a SanDisk external “USB4” SSD which claimed 20Gb/s; you would think it should all work together. But no, apparently the SSD needed 3.2 2x2 support, which the Mac doesn’t implement, since it’s apparently only an optional part of the USB4 spec…

One other funny thing - Apple’s page for “Identify the ports on your Mac” has 5 separate entries that are all USB-C shape. https://support.apple.com/en-us/109523


Unfortunately, nothing you said is new. Even before the new spec was done, before USB-C was a thing, all of these problems were predicted and raised.

No one would listen. You won't even find people on HN acknowledging these scenarios as a problem. Before 2022, 99.9999% of HN, and tech commentary generally, somehow assumed it would all be Thunderbolt-spec.

What makes it even worse is that Apple fell into the same trap as the rest of the Internet / tech / HN. I can almost guarantee this wouldn't have happened under Steve Jobs.

Unfortunately that battle is long lost. The world will have to suffer enough pain before it can learn and do something.


The simple but disappointing answer is that if you want a guaranteed level of functionality, you need to be looking at the Thunderbolt branding and ignore almost all the USB nomenclature. IIRC, the inability of some Apple Silicon machines to drive multiple external displays is why they're only advertised as "Thunderbolt" rather than TB3, 4 or 5: the specs are strict enough that even Apple can't cut corners without consequences.


I don't think this is true. If you look at the M4 MacBook Pro spec pages, it says Thunderbolt 4 for the base model, and Thunderbolt 5 for the M4 Pro models...

https://www.apple.com/macbook-pro/specs/


The ones listed as "Thunderbolt / USB 4", i.e. not Thunderbolt 3, 4 or 5, appear to be exactly the models that are limited to one external display (I'm not 100% sure about the iMacs). The models you mention with M4 and M4 Pro are all capable of driving multiple external displays, and the difference between TB4 and TB5 for those models is the maximum data rate (40Gb/s vs 80Gb/s, with an asymmetric 120Gb/s mode).

But I've never actually read a Thunderbolt spec, so I can't be sure exactly what disqualifies those "Thunderbolt / USB 4" from being advertised with a more recent Thunderbolt version number.


Just wait til they extend USB to carry this new super speed HDMI.


DisplayPort never caught on in the home entertainment market - have you ever seen a DisplayPort TV? And why would there be one, when everyone’s happy with HDMI?

DisplayPort, for most intents and purposes, has lost the format war. You buy a laptop - it’s HDMI on the side, not DisplayPort. You buy a monitor - DisplayPort is never an exclusive port on any model, but HDMI is.


Why do you assert that everyone is happy with HDMI? It has a lot of problems even before we get into the technical details. For example, it is so secretive and closed that the HDMI forum won't even let AMD implement HDMI 2.1 in their Linux driver: https://arstechnica.com/gadgets/2024/02/hdmi-forum-to-amd-no...


Wasn't the DRM black box the whole reason why HDMI exists? Like I personally don't give a shit about that but content distributors do.


[flagged]


Why would a random small business owner have opinions about DisplayPort vs HDMI...? I do not think that is a very useful benchmark for evaluating technology standards, and it is kind of a conversation killer to use it to avoid engaging with specific technical points people are bringing up in reply to you. This is a technical discussion forum, after all.


By that logic HDMI would never have gotten popular in the first place. No small business owner ever cared about what cable they connect to their TV as long as they have access to it.


> DisplayPort, for most intents and purposes, has lost the format war. You buy a laptop - it’s HDMI on the side, not DisplayPort.

The laptop's USB-C ports probably speak DisplayPort though. There was another USB-C alt mode which carried HDMI instead, but that spec was abandoned; DisplayPort is the de facto standard USB-C video protocol now.


Yeah, DisplayPort "lost" w/r/t the connector but it seems like DP-over-USB-C is the way forward.

Being able to put video, audio, power, and USB connectivity over one relatively common connector seems like the obvious choice.


I would argue the opposite - DisplayPort has won the computing format war, especially on laptops. Most laptops will have at least two USB-C ports with DisplayPort tunneling. Most business PCs have only DisplayPort connectivity, with the assumption that you'll buy a DisplayPort to HDMI adapter (if you have an older monitor that only supports HDMI).

True, we will never see them on the TV, but on the computer it is all DisplayPort.


If you buy a ThinkPad, an IdeaPad, or a MacBook Pro, what port is on the right side?


My MacBook Pro has one HDMI and one DisplayPort on the right side. The left side has two DisplayPort and a MagSafe.

Edit: And a MacBook Air would have just the two DisplayPort on the left with no HDMI, plus headphone jack on the right.


The MacBook's HDMI port is also just a USB device internally, utilising DisplayPort alt mode.


I just checked my ThinkPad fleet, excluding VGA: one has no extra video port, one has HDMI, and four have DisplayPort only.


ThinkPads will usually have multiple DisplayPorts (some as alt-mode on USB-C), and one HDMI for use with roach-motel-quality conference room setups.


HDMI replaced VGA as the "lowest common denominator port" for conference usage, not as major display tech.

HDMI is also, for various reasons, very tightly bound to the home entertainment ecosystem, because its DRM is mandatory while it's optional in DisplayPort.

Meanwhile DisplayPort has effectively won the format war for computer displays, and on all newer display connectors.


> Meanwhile DisplayPort has effectively won the format war for computer displays, and on all newer display connectors.

Not on this planet - for the highest end displays, maybe. Almost every small business and government office I see is still using VGA.


>Almost every small business and government office I see is still using VGA.

I don't know where you are from, but every government office and SME I have been to (and I have been to a lot) has had HDMI for their TVs since at least 2020. They may have VGA cables lying around, but they are not used. In fact, the last time I went to an office that had only VGA and no HDMI port was roughly 2016 or 2018.


I'd say not even the highest-end displays. I'm using HDMI 2.1 on my RTX 4080 because it has more bandwidth than the DisplayPort 1.4 ports on this graphics card.

It's a similar port selection on even the highest-end monitors: 40 Gigabit/s HDMI 2.1, or 26 Gigabit/s DP1.4. I'm unable to find a single DP2.0 monitor.


IIRC the main difference was that DP1.4 at 26 Gbit/s was more than enough to drive 4K displays.


Sure, but if we're talking about high-end displays, 4K by itself isn't exactly high-end anymore. A four-year-old PlayStation 5 can output not just 4K, but 120Hz with 10-bit HDR colour. This requires ~32 Gigabit/s, more than DP1.4 provides.

https://linustechtips.com/topic/729232-guide-to-display-cabl...

I'm personally on a 1440p240hz monitor. With HDR, this barely squeaks over DP1.4's bandwidth.
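
If you want to sanity-check those numbers, the back-of-the-envelope arithmetic is easy to do in Python. This is only a sketch: it assumes a flat ~10% blanking overhead instead of real CVT-RB timings, and it ignores DSC.

    # Rough uncompressed RGB bandwidth estimate. The flat 10% blanking
    # overhead is an assumption; real modes use CVT-RB timing tables.
    def video_gbps(width, height, refresh_hz, bits_per_channel,
                   blanking_overhead=1.10):
        bits_per_pixel = bits_per_channel * 3          # R, G, B
        raw = width * height * refresh_hz * bits_per_pixel
        return raw * blanking_overhead / 1e9

    print(video_gbps(3840, 2160, 120, 10))  # 4K120, 10-bit: ~32.8 Gbit/s
    print(video_gbps(2560, 1440, 240, 10))  # 1440p240, 10-bit: ~29.2 Gbit/s

Both figures exceed DP1.4's ~25.9 Gbit/s payload without DSC, which lines up with the chart linked above.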


Well, yes. Besides the ever-longer upgrade cycles of GPUs and displays, another thing that slowed DP2.0 adoption is the option of just using two cables for 51 Gbit/s, which was also a thing with early 4K displays.


Did DisplayPort support bonding signals like that? Or would it just show up as 2 separate monitors to the GPU?


It requires compatible GPUs, but it's how early 5K@60Hz displays were connected using DisplayPort 1.2, and how very large displays are connected in general. Displays that used this include the Dell UP2715K and early LG UltraFine 5K, and I've heard that Apple's XDR display uses it as well, avoiding DSC by setting up two DisplayPort 1.4 tunnels over TB3 and using tiled mode.

The display device provides DisplayID block id 0x28, "Tiled Display Topology", with a description of the tiling, which the GPU side then uses to configure the framebuffer -> DP source mappings so that from userland you see one display.

The DisplayID block also explicitly supports things like declaring bezels etc. if necessary.
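
To illustrate, here's a loose sketch in Python of how a driver might walk a DisplayID section looking for that block. The 3-byte block header (tag, revision, payload length) follows the DisplayID data-block structure; decoding of the payload is deliberately left out, since the real field offsets live in the spec.

    # Sketch: find the tiled-display topology block (tag 0x28 in
    # DisplayID 2.0) among a section's data blocks.
    TILED_DISPLAY_TAG = 0x28

    def iter_blocks(section: bytes):
        """Each data block: tag (1 byte), revision (1 byte),
        payload length (1 byte), then the payload itself."""
        i = 0
        while i + 3 <= len(section):
            tag, length = section[i], section[i + 2]
            if tag == 0x00:          # padding: no more blocks
                break
            yield tag, section[i + 3 : i + 3 + length]
            i += 3 + length

    def tile_block(section: bytes):
        for tag, payload in iter_blocks(section):
            if tag == TILED_DISPLAY_TAG:
                return payload       # rows/cols, tile location, bezel data...
        return None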


That's just nvidia being cheap.


Interesting. I don't think I've seen VGA anywhere outside retro setups or places that were set up once long ago and left like that. I suspect many office workers these days don't even know there was such a thing as VGA.

That said, HDMI 1.0 being DVI-D in a different connector means conversion to VGA is very easy, as the DVI-D stream is essentially VGA data minus the DAC step (including vsync blanking and overscan).


> HDMI replaced VGA as the "lowest common denominator port" for conference usage, not as major display tech.

I know a fair share of rancid conference rooms where the cabling is still 20 years old...


The decision to replace VGA with HDMI on ThinkPads was literally driven by market research showing that the likelihood of encountering HDMI setups had by then overtaken VGA.


That’s not really true. Any device with USB-C video out is using DisplayPort, not HDMI. HDMI alt mode exists, but AFAIK never shipped in any products.


There is also MHL alt-mode over USB-C. MHL was a way to do video over micro-USB at the time, and I really can't figure out whether any products were released with USB-C MHL support either.


If it’s not the physical connector, is it even worth calling DisplayPort? It might as well be called something else, that happens to run over a DP connector or a USB-C connector. It’s winning in a spiritual sense where nobody intuitively knows it’s DisplayPort.


It's worth it because it matters in the simple case of a normal user needing a converter dongle. Passive USB-C DP-alt-mode to DP cables are easy to acquire, cheap, and work fine.

USB-C -> HDMI is usually expensive because nearly nobody uses HDMI as an alt-mode outside a few old devices, so you end up with frustrated users who try to buy a cheaper converter and find out it does not work.


Looks to me like USB-C/HDMI cables are in the 5-20 EUR price range, about the same as USB-C/DP. I just got a new USB-C hub with HDMI port (4k@60) for about 15 EUR. Simple HDMI or DP cables (ie. same port on both ends) are available for 2-3 EUR, are there USB-C/DP cables in that range?


I wish the opposite was easier to find, since I have one display device with only a USB-C input, and a desktop computer.


I had this issue as well - Amazon (did?) have a bi-directional USB-C to DisplayPort cable [0], which works well for me. It looks like the 3ft one is still available.

[0] - https://www.amazon.ca/gp/product/B081VK1KHV/


I'm seeing fewer and fewer HDMI ports on computer monitors, graphics cards, projectors, etc. Since IIRC they charge a royalty per port, this makes a lot of sense.

I buy a laptop - it's USB only. No HDMI, DisplayPort, VGA, headphone jack or anything else. When was the last time you bought one?

I buy a desktop graphics card. 1 HDMI - for compatibility, you know - and 3 DisplayPort. The HDMI is port number 2.

Last time I used a projector it was DisplayPort or VGA. I brought both kinds of adapters just in case.

My newest monitor has power input, DisplayPort and VGA. No HDMI.


"I buy a laptop - it's USB only. No HDMI, DisplayPort..."

So it has no video output?


It has a screen, and it has the USB-C ports' alternate-mode lanes connected to the GPU.


Besides DisplayPort over USB-C, which people have already mentioned, there's also eDP, which is basically the standard for connecting a laptop's screen to its GPU. Buy a laptop and, regardless of what external ports it has, DisplayPort is used for the internal display. And since most laptop users don't regularly connect to external displays, one can certainly say that the usage of DisplayPort here far eclipses HDMI.


DisplayPort in a way won. USB-C video is DisplayPort. And the HDMI port on your laptop is often just an internal USB adapter converting DisplayPort alt mode to HDMI.


DP doesn't have eARC for audio, that won't work.


DisplayPort "fast AUX" stream is more than capable of returning audio or even video data to the "source" and some videoconferencing rigs even implement this. But I don't see why the industry would be motivated to adopt and consolidate around that standard when HDMI's bidirectional features already exist and are widely supported.


AUX is its own channel, and one I seriously wish were put to better use! It seems to only be USB2-capable, but even that is rarely implemented!! So sad.

But one of the neat things about DisplayPort is that it's a packetized protocol that many streams can go over. Which is why there can be multiple daisy-chained monitors!

So hypothetically the computer could stream out all manner of high-bandwidth data. It's been neat to see the USB4 adaptation, where DisplayPort tunnels USB4 for data coming from the computer, and the AUX channel is used as the 480Mbit return channel.

I don't really have any knowledge of why the AUX channel is regarded as not capable of multiple Gbit/s, and maybe that's not even true.

Given what a terrible, inflexible trash port of the past HDMI is though, it sure makes me wish folks would do more with all the possibilities DisplayPort has. And it makes me really really wish consumer electronics would please please please offer some DisplayPort. Given that USB-C mandates DisplayPort as a required alternate-mode video system, maybe possibly hopefully this will be some leverage that starts prying the HDMI rent-seekers' stranglehold open.


The physical layers of everything are converging. The breakthrough we need is on the logical layer, with peers describing their capabilities and working out a sensible, predictable suite of resulting functions. Right now there are _zero_ conventions for what the DP AUX stream should do, so a random member of the public has no expectations. HDMI for the most part does what normal people expect. USB has maybe a few too many possible outcomes... for example yesterday I discovered that my Pixel phone was charging my MacBook, instead of the other way around.

Unfortunately working out a set of norms and expectations that are useful to users is harder than just making bits go over wires.


"...this would align the new HDMI standard with the latest DisplayPort 2.1 technologies"


CEC should be required by the spec. It's tantalizingly close to "just works" but marred by devices that ought to support it but don't, like my RTX 3060/Windows (which?).

Another fun issue I have with it is that my Apple TV 4K will randomly (1 time in 3-4 hours?) tell the receiver to change input when I'm on another. I'm guessing there's some scheduled wake happening and it inadvertently treats it as a user interaction. Adds a nice element of difficulty to Elden Ring.


My favorite is when my receiver decides to send audio to my PROJECTOR instead of to the goddamned speakers that are attached to said receiver.

I heard tinny sound coming from somewhere behind me and was baffled when I approached the projector and heard sound coming from its tiny built-in speaker.


Is there any reason why most TVs still don't support DisplayPort over USB-C and just support HDMI?

With the recent drama around HDMI vs. FOSS drivers on Linux, I'm curious why we haven't seen a bigger push from TV vendors to support USB-C DisplayPort.

Most of my monitors work with USB-C, and even consoles like the Steam Deck and a lot of high-end phones seem to support USB-C.

So... are there any features of HDMI that USB-C DisplayPort doesn't support?


DisplayPort in native mode lacks some HDMI features such as Consumer Electronics Control (CEC) commands. The CEC bus allows linking multiple sources with a single display and controlling any of these devices from any remote. From its very first version, HDMI has featured CEC to support connecting multiple sources to a single display, as is typical for a TV screen; DisplayPort 1.3 added the possibility of transmitting CEC commands over the AUX channel.

The other way round, DisplayPort's Multi-Stream Transport allows connecting multiple displays to a single computer source.

This reflects the fact that HDMI originated from consumer electronics companies, whereas DisplayPort is owned by VESA, which started as an organization for computer standards.
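
For a concrete sense of how lightweight CEC is, here is a minimal sketch of its message framing in Python. The logical addresses and opcodes below are from the CEC spec; the single-wire electrical signalling, ACK bits, and bus arbitration are all omitted.

    # CEC message framing only; electrical signalling and ACKs omitted.
    CEC_TV        = 0x0   # the display is always logical address 0
    CEC_PLAYBACK1 = 0x4   # e.g. a Blu-ray player or streaming box
    CEC_BROADCAST = 0xF

    OPCODE_IMAGE_VIEW_ON = 0x04   # "wake up and show my input"
    OPCODE_STANDBY       = 0x36   # "go to standby"

    def cec_message(initiator, destination, opcode, *operands):
        """Header byte: initiator in the high nibble, destination in
        the low nibble; then the opcode and any operands."""
        header = (initiator & 0xF) << 4 | (destination & 0xF)
        return bytes([header, opcode, *operands])

    # A playback device waking the TV, and a broadcast putting
    # the whole chain into standby:
    cec_message(CEC_PLAYBACK1, CEC_TV, OPCODE_IMAGE_VIEW_ON).hex()   # '4004'
    cec_message(CEC_PLAYBACK1, CEC_BROADCAST, OPCODE_STANDBY).hex()  # '4f36'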


> From its very first version HDMI features CEC

That depends on whether or not you consider single-link DVI-D as the "very first version" of HDMI.


DP has a different management interface that's actually better because it's a standard unlike CEC.


I imagine CEC could be added to DisplayPort.


DisplayPort->HDMI was the only way I could get my PC to control the TV it was attached to over CEC, since (apparently) the CEC pins aren't enabled on most consumer GPUs, but the DisplayPort aux pin can be used as an alternative.

So the linux DisplayPort driver at least supports the feature, whether or not it's fully standardized/required.


I don't know how true it is, but I heard that

- TV manufacturers are often (always?) members of the HDMI consortium, meaning they financially profit from each device that has an HDMI port.

- Manufacturers of devices with HDMI ports are heavily discouraged from also including competing ports like DP.


DRM is pretty much mandatory for HDMI in a home entertainment setup, and is controlled by the media companies' cabal, which is important to vendors in that space (who are often members too).

Also, in lowest-common-denominator HDMI for a TV, the HDCP decoder is the only complex part; you can theoretically drive a dumb panel with a few fixed-function ICs that aren't even related to HDMI. In the simplest, dumbest case you can advertise the bare minimum supported display setting in the EDID, plop in a few buffers and DACs, and drive an analog TV off it.

Meanwhile the simplest possible DisplayPort implementation still requires that your display can handle packetized data sent over a PCIe-style PHY layer, with more complex data structures and setup information than "plop an I2C ROM chip here, a few DACs and buffer logic here, and you've got HDMI without HDCP to an analog CRT".
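
To show how simple the "I2C ROM chip" part really is: the EDID a sink exposes over I2C (conventionally at address 0x50) is a 128-byte base block with a fixed 8-byte header and a one-byte checksum. A minimal validity check, as a sketch:

    # EDID base block: 128 bytes, fixed header, and all bytes must
    # sum to 0 mod 256 (the final byte is the checksum).
    EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

    def edid_is_valid(edid: bytes) -> bool:
        return (len(edid) == 128
                and edid[:8] == EDID_HEADER
                and sum(edid) % 256 == 0)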


The world doesn't care about FOSS drivers.

Anyway, HDMI is for TVs and DisplayPort is for monitors. They're both entrenched enough that it doesn't make much sense to try to cross over.


OK, but why not just try to converge on one connector? As you say, the world doesn't care...


Fundamentally different goals.

HDMI world wants DRM because other than Kim Dotcom (who is highly problematic for other reasons) no one has dared to stand up to the MAFIAA goons.

Computing world wants DisplayPort and a hassle-free, high quality experience. DRM breaks that.


The manufacturers very much care - every HDMI port (physical or virtual) involves a tithe to the HDMI consortium, whereas DisplayPort does not.


I don't follow - surely if the manufacturers have to pay for every port, they'd like something that isn't costing them money?


Which is why, in my experience, on computer hardware it's common to see one, maybe two HDMI ports (among other reasons, to support things like connecting a console to your monitor) and multiple DisplayPort connectors. Sometimes it gets ridiculous: taking a maximalist approach to counting on my laptop as currently docked, from the point of view of "ports implemented in the GPU" rather than those physically available, gives me one HDMI and 15 DisplayPorts, with the HDMI being routable to a bunch of options. Of course that's a silly comparison, but depending on how the licensing is worded I would be really unsurprised if computer manufacturers optimized for a lower HDMI port count.

Also, since older HDMI was physically compatible with DVI ports, a lot of computers preferred to keep DVI ports...


>why most TVs still don't support USB C display port and just support HDMI

Cost. USB-C has much more overhead. For example, people expect they’ll be able to charge with it and have USB pass-through.


>Is there any reason why most TVs still don't support USB C display port and just support HDMI?

Personally I’m convinced it’s mostly because the display manufacturers want to discourage the use of TVs as monitors, in order to protect their margins on monitors.

8K monitors should be sub-$1,000 by now and standard for anyone working with screens. You can get that as a TV but not as a monitor. :(


CEC is probably one of them, but I'm not sure that's a bad thing.


I don't think DisplayPort has an equivalent to eARC either.


I feel like it will be a long time before this is considered essential, if it ever is. An end-to-end 2.1 setup will get you 4K120 with eARC and numerous flavours of VRR and HDR (though I don't think those last two are necessarily tied to a specific HDMI version).

Heck, it's still rare to find a TV where all HDMI ports are 2.1 (for years it was only LG, not sure if it's changed this year).


> still rare to find a TV where all HDMI ports are 2.1

On a couple of 4K Samsung monitors I have, HDMI port 1 is HDMI 1.4 and HDMI port 2 is HDMI 2.0.

Couldn't be simpler. The HDMI version is on the port label. Something about the situation makes me chuckle.


Why aren’t they all HDMI 2.0, isn’t it backwards compatible? Or is it just to save them a $ while technically still being able to advertise it as HDMI 2.0?


I'm going to assume it was cost cutting.

The boss said to make sure it had an HDMI 2 port. The engineer could have interpreted it two ways, so they did both ways in one.


Probably it would need a better CPU to handle more traces, and that's the real expense.


VR needs it. You’ve got two displays that have a very high resolution and refresh rate, which need to share one cable.


What I want is a display standard that allows (or legally forces) a monitor to detect the signal within 100ms.

Waiting for a monitor for 5 seconds (often without knowing what it is doing or on which input it is looking) is such a bad UX.


It actually has little to nothing to do with the connector standard and more to do with the TV system itself. I know there is an HDMI / HDCP handshake, but it doesn't contribute much to the 5s of switching channels.


I'm talking about switching sources on a monitor that's already on ;)


It should also be legally mandated to have individual buttons for each input instead of having to navigate a silly software menu, for when the auto-detection invariably fails!


> If confirmed, this would align the new HDMI standard with the latest DisplayPort 2.1 technologies, offering consumers expanded options for ultra-high-definition media and gaming experiences.

Why do we need new HDMI when there already is DisplayPort for this?


HDMI supports features needed for entertainment centers, like CEC and eARC. DisplayPort doesn’t have these.


Curious. Never heard of these. Thanks.


Because HDMI connectors can fit on PCBs.

https://forum.kicad.info/t/hdmi-pcb-edge-connector-for-raspb...

Never tried that with a DisplayPort connector.


I've never seen a single commercial device do that. I wouldn't be surprised if it isn't allowed.

That's not a reason to keep HDMI. Obviously the real reason is that HDMI makes money for the people selling it.


HDMI connectors are easier to plug in correctly the first time, and their cables usually flex more.

I always feel like those thick DisplayPort cables are going to rip the port off the PCB.

Even with HDMI it's an actual concern. There should be some rubber suspension on every port to allow for some torquing.


The cables flex more because it's lower bandwidth so there's less shielding.


There’s no reason DisplayPort couldn’t do the same thing. The connector on the inside is basically identical.

That being said, although creative, it’s a pretty terrible idea. PCBs are not meant to be edge connectors, especially if they aren’t designed for it (i.e. using hard gold on the edge connector plating).


Sorry, I didn't know that PCB edge connectors were a terrible idea! I saw it on a PCIe slot before and thought it might work.

The other comment from @wtallis has a good point about electromagnetic interference (EMI). But I think it's shielded when it's plugged in, by the female connector.


To be clear: PCBs are acceptable as edge connectors assuming they’re designed for it, but unless you use the proper plating the number of insertions from a normal PCB plating (ENIG) is likely in the low double digits (10-20 times).


For those applications wouldn't MiniDP work just as well? Or DP-over-USB C?


Oh, that's a neat edge connector idea, I've only seen this done with USB before.


Yeah, that's definitely never going to pass EMI testing.


There’s an interesting middle-ground worth mentioning: a 2-in-1 connector https://www.mouser.com/ProductDetail/Rego-Electronics/845-00... that can take both HDMI and DisplayPort cables, which could be a neat solution for devices juggling both standards.

Here’s a video https://www.youtube.com/watch?v=rZpHizpZSPQ showing a PC with this connector in action.

Maybe not a game-changer, but it’s an interesting idea to reduce port clutter without forcing a winner in the format war just yet.


linux support? none? ok i'll pass


Mandate QMS and fix it.

It’s 2024. Why do TVs flash and stutter when switching modes? It’s pathetic. Quick Media Switching partially fixes this. It makes TVs switch their frame rates fast, so you can go from 24 to 30 to 60 without noticing it. Nice, right?

But that’s it. Need to switch resolution or go from normal to HDR? QMS doesn’t support that. Back to black screens.

Come on. This should be table stakes.

I didn’t see it mentioned in the article at all though.


The tech industry standards bodies need to take a rather ferocious look in the mirror. These USB, smart-device, and HDMI bodies are failing consumers, failing sustainable practices, and just smell like poorly run organizations.


Curious about the length limitations.

DisplayPort and Thunderbolt are starting to make lengthier runs (7+ ft) tricky, which is still within the realm of cable management, particularly with hybrid standing/sitting desks.


Same. It's already tough finding a cable with enough length to go from my floor-standing PC through my cable-managed standing desk and monitor arm.


There will probably be long (and expensive) fiber optic cables eventually, but yeah the basic copper cables are likely to have very limited reach.


They already exist. I recently bought a 33 foot fiber DP1.4+USB3.1/2 (depending on how much video bandwidth you’re using) cable on Amazon for about $130.


I meant they'll eventually exist for HDMI 2.2, the current ones probably won't support the new higher bandwidth modes.


(For the people that don't measure the world in limbs: that is ~10 meters.)


If you need long runs, place a pair of SDI media converters in between. Spec is IIRC 100m for coax cable runs, longer if you run fiber.


It is sad that TVs have barely been supporting HDMI 2.1; maybe this is more of a MediaTek or chipset problem. We only expect Sony TVs in 2025 to have more than one HDMI 2.1 port. How long do we have to wait for HDMI 2.2?


I really prefer USB-C over both DisplayPort and HDMI.

Reasons:

- Easy to hook up to a modern laptop (all MacBooks) that only has USB-C outputs.

- This allows your TV to become a hub.

- I have invested in a lot of high-quality USB-C cables rather than lots of different types of cables, all of which require either hubs or built-in converters.

Can you hook up a USB-C display to an iPhone or Android phone directly yet?

My ask of TV manufacturers:

Modern TVs should have built-in cameras/microphones (or the ability to pull in a camera via USB and mount it on top) to enable video conferencing, so that my Chromecast/Google TV device can get me into Zooms/Google Meets without any other devices.


Video through your USB-C port IS DisplayPort. It's just a different physical connector.

(Yes, I realize that there's a video-over-USB thing with a similar name, but that's old heavily-compressed niche junk.)


Sounds like a hack to me. Does the USB-C protocol allow the use of another protocol over the same connector?


Yes. The type C connector includes a couple of reassignable differential pairs explicitly for these kinds of shenanigans.


Thanks, that's good to know. Where do you learn about these things without reading 500 pages of specifications?



Yes, Alternate Mode[0] lets you run more or less whatever you like over those pins.

[0] https://www.cablechick.com.au/blog/definitive-guide-to-usb-c...


Hilarious that the “protocol” has the word Port in it. Could it be any more confusing? Especially when something named “DisplayPort” is working via a USB-C port.

What a mess. It’s like a parody skit… imagine how a layman would interpret all this.


> Can you hook up a USB C display to a iPhone or Android phone directly yet?

iPhone: yes.

https://support.apple.com/en-us/105099#:~:text=USB-C%20displ...


Isn’t video over USB-C still DisplayPort, just encapsulated?


Usually it's a raw DisplayPort connection using some of the SuperSpeed lanes on a USB3/USB-C connector. Encapsulation is optionally available when running Thunderbolt/USB4 on the port; then, between consenting devices, you can set up a virtual circuit where DisplayPort packets are encapsulated in Thunderbolt/USB4 packets.


Man, this is a sad answer for this forum.

Do you know anything about USB-C and why there aren't any cables over 3m (10ft)?

And why those are $100+ cables?

(USB4/TB3+ cables, capable of 40Gbps like HDMI 2.1.)


> Can you hook up a USB C display to a iPhone or Android phone directly yet?

Your phone should have a video output inside the USB-C port.


HDMI 2.1 is useless already; how many monitors halve the bandwidth of their 2.1 connector? Standards need enforcing, none of this fake HDMI 2.x bullshit.


HDMI Forum to AMD: No, you can’t make an open source HDMI 2.1 driver

https://arstechnica.com/gadgets/2024/02/hdmi-forum-to-amd-no...


This is such an annoying limitation. With monitors it’s one thing, as DisplayPort is normally an option as well (though with an iGPU maybe not), but if you want to display on a high end TV? You need to go find an active DisplayPort -> HDMI converter that supports all the features you need.

From the bug report [0], there was an update 5 months ago that discussions (at AMD) were still ongoing.

[0] - https://gitlab.freedesktop.org/drm/amd/-/issues/1417


The point of the HDMI port is that access to it is controlled by the consortium, and that it includes DRM.

This isn't necessarily compatible with FOSS.


Well, I'm glad that BigContent hasn't taken control of our CPUs yet.


Guess what a considerable portion of Intel ME is for, or why normal AMD CPUs have a closed-source PSP...


It's not really the DRM part, but the fact that someone could avoid paying $15,000 by reading some publicly-available source code instead of joining the HDMI Cabal.


No, that's the point of DisplayPort in being royalty-free and unencumbered.

HDMI was deeply intertwined with the brave new world of Protected Media Path.


I'd love to see the driver include an error message with the personal cellphone number of whoever it was at the HDMI Forum who made that decision.

"For further information, call these people."



