> I wish HDMI in general would die and let DisplayPort take its place.
That would be an interesting thing for the video production industry. Basically the entire market is divided between "professional" equipment, where SDI continues to dominate (with some movement toward SMPTE ST 2110, i.e. video over IP), and "amateur" equipment, which is all HDMI - with very few products straddling the boundary and supporting both connectors.
Consumer-grade digital cameras have only recently (10 years, maybe less) started being able to output a live video feed over HDMI. Before that - believe it or not - I've stumbled upon MANY camera models from 2013 or before that had an HDMI port, but all it was good for was displaying pictures from the memory card on a TV.
It would certainly make life easier in a lot of "small streamer" setups. Currently, if you don't want to torture your camera's battery, you need a silly (often third-party) "dummy battery" that you can (hopefully!) plug into an ordinary USB power supply; and on top of that, a separate mini/micro-HDMI -> full-size HDMI cable to plug into a capture card. If you could reduce all that to a single USB-C cable carrying PD and DP - trust me, every silly cable you can eliminate from your setup is an enormous win.
Even better: if these cameras could also talk the regular USB "webcam" protocol (UVC) in addition to DP, that would eliminate the capture card for the overwhelmingly common setup of "I just want to look very good on video calls".
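For what it's worth, the "webcam protocol" here is USB Video Class (UVC): a UVC camera just shows up as a generic video device, with no vendor driver or capture card needed. A rough sketch of what the receiving side looks like with OpenCV (the device index and resolution are assumptions, not anything camera-specific):

    # Minimal sketch: grabbing frames from a camera that enumerates as a
    # standard UVC device (a plain /dev/video0 on Linux, a normal "webcam"
    # elsewhere). Device index 0 and 1080p are assumptions.
    import cv2

    cap = cv2.VideoCapture(0)                    # first UVC device the OS sees
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)      # request 1080p if the camera offers it
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)

    while True:
        ok, frame = cap.read()                   # one BGR frame per call
        if not ok:
            break
        cv2.imshow("camera", frame)
        if cv2.waitKey(1) == 27:                 # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()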
But that opens a can of worms: in any non-trivial setup, a camera (one camera) is merely a small piece of a much more elaborate puzzle. Even seemingly simple setups end up converting the signal back and forth through some crazy stuff. On one job, we needed to run an SDI or HDMI cable between floors, but couldn't do either because the building was untouchable; so we used a couple of HDMI -> HDBaseT converters to run the signal over existing Ethernet cables. Turns out it was then no longer possible to convert the resulting HDMI signal onward to SDI (we tried many converters, all failed), which limited our choice of video mixers. Would the signal have made it through if it had originated as DP? Your guess is as good as mine.
Broadcast is a strange place. I still laugh whenever I think of Quad-SDI; only the broadcast industry could ever come up with that. Things need to work with one another and even if every single person in the world agreed that HDMI must die, starting today, I'm fairly certain we'd still see new equipment being made in 2033 that supports it.
I have spent the last few months doing a deep dive on broadcast / audio engineering standards. The lack of reliability and the strange standards are interesting...
It seems like the last few standards started out really robust and open because of the lack of compatibility, and then greed got involved and vendors slipped in something to make cross-connecting difficult. I assume so people would have to buy more of their stuff.
The focus on "realtime" makes the standards perform worse in practice (poor handling of dropped or corrupted bits), and makes the IP-based standards much harder to route (high-bitrate streams congest the uplinks). WebRTC, by comparison, can be quite nice.
I seriously don't have any hope for sanity in that market.
Consistently good colors. Sometimes even PC monitors won't negotiate properly with the PC and start displaying washed-out colors. IIRC this had to be forced on the Pi side when using a Raspberry Pi (rough config sketch below). Not all monitors can be adjusted.
Consistently no under/overscan. For some reason, the TVs we had in conference rooms figured it would be smart to cut the borders of the image and zoom in, so you get missing bits and whatever's left is a blurry mess.
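For the record, the Raspberry Pi fix I remember was forcing full-range RGB on the Pi side in /boot/config.txt. This applies to the legacy firmware display stack; option names may differ on newer KMS setups - roughly:

    # /boot/config.txt - force full-range RGB over HDMI (legacy display stack)
    hdmi_pixel_encoding=2   # 0=auto, 1=RGB limited (16-235), 2=RGB full (0-255)

    # If the Pi insists on treating the display as a TV, forcing a DMT
    # (computer monitor) mode can also help:
    # hdmi_group=2
    # hdmi_mode=82          # 1920x1080 @ 60 Hz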
For the TV situation, you usually don't have a full remote to adjust it, if the TV even supports that. You often don't have time to dig through the TV's typically crappy menu system. I've usually found the option to disable overscan, but an option for full-range RGB seems less common.
Also, HDMI seems to lag DisplayPort capabilities when it comes to higher resolutions and refresh rates. When my 2013 MBP came out, it could drive a 4k@60 screen over DP. HDMI required the 2.0 version to do that, which, IIRC, came much later.
>Also, HDMI seems to lag DisplayPort capabilities when it comes to higher resolutions and refresh rates.
I think that one very much depends on when you choose to look at it. HDMI 2.1 has more bandwidth than DisplayPort 1.4: enough to do 4k@120 at 10-bit color, which DP 1.4 can't carry without dropping to 4:2:0 chroma subsampling (or using DSC). DisplayPort 2.0 devices are starting to come out, but even Nvidia's RTX 4000 series still doesn't have DisplayPort 2.0 (it does have HDMI 2.1), while TVs started supporting HDMI 2.1 around 2019, and the PS5 and Xbox Series X both have HDMI 2.1 ports.
So while DisplayPort may be ahead now with DisplayPort 2.0, HDMI was ahead for at least 4 years with HDMI 2.1.
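The back-of-the-envelope numbers line up with that, at least roughly. Here's a quick sketch comparing required uncompressed bit rates against approximate link payload rates (the ~6% blanking overhead and the payload figures are rough approximations, not exact spec values):

    # Rough comparison: uncompressed RGB video bit rate vs. approximate
    # effective link payload (after 8b/10b or 16b/18b coding overhead).
    LINKS_GBPS = {
        "HDMI 1.4 (340 MHz TMDS)":  8.16,
        "HDMI 2.0 (600 MHz TMDS)": 14.40,
        "DP 1.2 (HBR2 x4)":        17.28,
        "DP 1.4 (HBR3 x4)":        25.92,
        "HDMI 2.1 (FRL 48G)":      42.67,
    }

    def needed_gbps(h, v, hz, bits_per_channel, blanking=1.06):
        """Approximate RGB data rate, padded ~6% for blanking intervals."""
        return h * v * hz * 3 * bits_per_channel * blanking / 1e9

    MODES = {
        "4k@60, 8-bit":   (3840, 2160, 60, 8),
        "4k@120, 10-bit": (3840, 2160, 120, 10),
    }

    for name, (h, v, hz, bpc) in MODES.items():
        need = needed_gbps(h, v, hz, bpc)
        fits = [link for link, cap in LINKS_GBPS.items() if cap >= need]
        print(f"{name}: ~{need:.1f} Gb/s -> fits: {', '.join(fits) or 'none of these'}")

That lands on roughly 13 Gb/s for 4k@60 8-bit (too much for HDMI 1.4, fine on HDMI 2.0 and DP 1.2) and roughly 32 Gb/s for 4k@120 10-bit (too much for DP 1.4, fine on HDMI 2.1).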
That's a fair point. I admit I was judging by the availability on the PC side (I don't follow the console market).
Although, if I'm not mistaken, my particular PC monitor initially didn't support HDMI 2.0, even though it's a 4k panel. There was a later revision which added it. I have that revision, but support is still somewhat wonky, in that the monitor can't seem to switch on its own between 2.0 and 1.4b.
Display support is always last in this chicken-and-egg problem. As far as I know, there are still no displays supporting DisplayPort 2.0, so the higher-end monitors all require Display Stream Compression. I can't even get a DisplayPort 2.0 MST hub so I can daisy-chain multiple 1440p@144 monitors - which is definitely something HDMI can't do at all.
OK, so that sounds like DP has a better negotiation protocol spec and/or better implementations.
I've only encountered overscan on TVs, not monitors, but I don't give conference room presentations, where it would be very annoying. On Macs there's a compensation setting for that.
I've had trouble with colors where the output uses YPbPr rather than RGB, but that seems to be an Apple thing where it's done on purpose for non-Apple-approved displays, and it happens over both HDMI and DP. Generating a custom EDID profile fixes that. I can't recall having trouble with color range; sometimes the display has a setting, but the default always looked better to me.
I've used 4k@60 HDMI just fine (and 4k@30 on an early Apple adapter), but more often use 1080p anyway. I use USB-C with my 4k displays, which likely runs DP to them.
> OK, so that sounds like DP has a better negotiation protocol spec and/or better implementations.
Right. So... DP is better than HDMI?
The color issue I had was not with a Mac but with an HP laptop on an HP monitor. My understanding is that it's the "broadcast colors" / limited-range thing: regular RGB, but squeezed into a narrower range (16-235 instead of 0-255). I think the PC decided the monitor was a TV expecting limited range, which it was not.
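Tangentially, for anyone hitting this on Linux with Intel graphics: the driver exposes a "Broadcast RGB" property on the connector that can be forced to full range. A hedged example (the output name is whatever xrandr reports on your machine; I haven't checked whether other drivers expose the same property):

    # Force full-range RGB on an Intel-driven HDMI output (output name is an example)
    xrandr --output HDMI-1 --set "Broadcast RGB" "Full"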
With the TVs I plug into, it's usually harder to judge, since their color rendition tends to be all over the place anyway, and they tend to have the reverse issue (too much contrast).
I remember the overscan control on the Mac, but it was still a PITA to have to fiddle with that instead of, you know, just plugging the screen in and being in business.
While I've also had numerous positive experiences with HDMI where things seemingly "just worked", I've never had an issue with DP. It always worked. Hell, even my gaming GPU, which came out a while after HDMI 2.0 and supported it out of the box, connected to a monitor with full HDMI 2.0 support, still produced weird colors compared to DP. No amount of tweaking in the AMD drivers got me the proper output, so I went and bought a cheap Chinese DP KVM instead - which worked with no fuss.
All this makes me automatically pick DP if given the choice, and discount any computer or screen that only does HDMI. Which makes it pretty tough to buy a TV, so I just watch movies on my computer monitor.
Yeah, totally agree - I've had quite a few situations where a monitor or TV looks blurry and washed out over HDMI, and it's immediately fixed with a DP cable.