I'd say not even the highest-end displays. I'm using HDMI 2.1 on my RTX 4080 because it has more bandwidth than the DisplayPort 1.4 ports on this graphics card.
It's a similar port selection on even the highest-end monitors: 40 Gbit/s HDMI 2.1, or ~26 Gbit/s DP1.4. I'm unable to find a single DP2.0 monitor.
Sure, but if we're talking about high-end displays, 4K by itself isn't exactly high-end anymore. A four-year-old PlayStation 5 can output not just 4K, but 120 Hz with 10-bit HDR colour. That requires ~32 Gbit/s, more than DP1.4 can carry.
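The ~32 Gbit/s figure is easy to sanity-check. A minimal sketch (the function name and the simplification of ignoring blanking/encoding overhead are mine):

```python
# Rough uncompressed video bandwidth: pixel data only, ignoring blanking
# intervals and link-layer encoding overhead, so real on-wire figures
# come out somewhat higher.
def video_bandwidth_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    bits_per_pixel = bits_per_channel * channels
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 120 Hz with 10-bit colour, as the PS5 can output:
print(round(video_bandwidth_gbps(3840, 2160, 120, 10), 2))  # 29.86
```

Add the blanking intervals on top of that ~29.9 Gbit/s of pixel data and you land around ~32 Gbit/s on the wire, versus DP1.4's ~25.9 Gbit/s of usable bandwidth.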
Well, yes. Besides the ever-longer upgrade cycles of GPUs and displays, another thing that slowed DP2.0 adoption is that you can simply use two cables for ~51 Gbit/s, which was also a thing with early 4K displays.
It requires compatible GPUs, but that's how early 5K@60 Hz displays were connected using DisplayPort 1.2, and how very large displays are connected in general. Displays that used this include the Dell UP2715K and the early LG UltraFine 5K, but I've heard that Apple's XDR display uses it as well to avoid DSC, by setting up two DisplayPort 1.4 tunnels over TB3 and using tiled mode.
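The two-cable requirement for those 5K@60 panels falls straight out of the numbers. A quick sketch (function name mine; DP1.2 usable bandwidth is HBR2 x4 lanes after 8b/10b coding):

```python
# Why early 5K@60 panels needed two DisplayPort 1.2 cables: raw pixel
# data alone exceeds one DP1.2 link's usable bandwidth.
def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

five_k = pixel_rate_gbps(5120, 2880, 60)  # ~21.2 Gbit/s, before blanking
dp12_usable = 17.28                       # HBR2 x4 lanes after 8b/10b coding

print(five_k > dp12_usable)       # True: one cable isn't enough
print(five_k < 2 * dp12_usable)   # True: two cables are
```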
The display device provides DisplayID block id 0x28, "Tiled Display Topology", with a description of the tiling, which the GPU side then uses to properly configure the framebuffer-to-DP-source mappings so that from userland you see one display.
The DisplayID block also explicitly supports things like declaring bezels etc. if necessary.
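To make the stitching concrete, here's an illustrative sketch of the kind of information that block carries and how a driver could derive the single logical display from it. The field and function names are descriptive placeholders, not the spec's actual byte layout:

```python
# Illustrative model of a tiled-display descriptor: each tile reports the
# grid shape, its own position in the grid, and its resolution, so the
# GPU can stitch the tiles into one logical framebuffer.
from dataclasses import dataclass

@dataclass
class TileInfo:
    topology_id: int  # same ID on every tile of one physical display
    rows: int         # grid dimensions
    cols: int
    row: int          # this tile's position in the grid
    col: int
    tile_w: int       # this tile's resolution in pixels
    tile_h: int
    bezel_px: int = 0 # optional bezel declaration between tiles

def logical_size(tiles):
    """Total resolution the GPU should expose to userland."""
    width = sum(t.tile_w for t in tiles if t.row == 0)
    height = sum(t.tile_h for t in tiles if t.col == 0)
    return width, height

# A 5K display driven as two 2560x2880 tiles, one per DP link:
left  = TileInfo(topology_id=1, rows=1, cols=2, row=0, col=0, tile_w=2560, tile_h=2880)
right = TileInfo(topology_id=1, rows=1, cols=2, row=0, col=1, tile_w=2560, tile_h=2880)
print(logical_size([left, right]))  # (5120, 2880)
```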