I'd say there are two remaining roadblocks. First and biggest is kernel level anti-cheat frameworks, as you point out. But there's also the HDMI cartel refusing to allow an open source HDMI 2.1 implementation, so people like me with an AMD card max out at 4K60 even for open source games like Visual Pinball (unless you count an adapter with hacked firmware between the card and the display). Nvidia and Intel get away with it because they implement the functionality in their closed source blobs.
This is kind of a niche problem. It only affects people with AMD GPUs running games at over 4k60 with HDMI. Get an NVidia or stay at 60 FPS or stay at 1080p or use DisplayPort and you will be fine.
It is not really a roadblock, more like a bump, and it is not the only bump by far. Some games just don't run on Linux, or run quite terribly, and they don't have a big enough community for people to care. Sometimes one of your pieces of hardware, maybe an exotic controller, doesn't like Linux. Sometimes it is not the fault of the game at all, but you want to do something else with that PC that isn't supported on Linux, and you don't want to dual boot. Overall, you will have fewer problems gaming on Windows, especially if you don't really enjoy a trip to Stack Overflow and the command line, but except maybe for anti-cheat, there are no "big" reasons, just a lot of small ones.
It's a blocker if you want to use a TV; there are almost zero TVs with DP. This HDMI licensing crap is also the reason a Steam Deck can't output more than 4K@60 over HDMI unless you install Windows on it.
Yup, this works, but there's as of yet no UHBR13.5 or better input, so you're not getting a full HDMI 2.1 equivalent. But if you don't mind DSC instead of uncompressed 24 bits per pixel, you can have an otherwise flawless 4K120Hz experience.
DP is something like a free superset of HDMI, so you can use a fully passive DP-HDMI cable. Obviously the feature set will be limited, but it will work.
DP however can't transfer audio, which doesn't matter for a desktop but matters a lot for a TV.
> DP is something like a free superset of HDMI, so you can use a fully passive DP-HDMI cable.
No, it's not; the protocol is completely different (DP is packet-based while HDMI traditionally was not, though AFAIK HDMI 2.1 copied DP's approach for its higher speed modes). When you use a passive DP-HDMI cable (which AFAIK is not fully passive; it has level shifters since the voltages are different), it works only because the graphics card detects it and switches to using the HDMI protocol on that port. If it's not a dual-mode port (aka "DP++" port), it won't work and you'll need an active DP-HDMI adapter.
> DP however can't transfer audio, which doesn't matter for a desktop but matters a lot for a TV.
On the desktop I'm using to type this message, I use the speakers built into the DP-connected monitor (a Dell E2222HS). So yes, DP can and does transfer audio just fine. If it couldn't, then active DP to HDMI adapters wouldn't be able to transfer audio either.
The only thing DP doesn't have AFAIK is ARC, which might matter for a few more exotic TV use cases, and HEC, which AFAIK nobody uses.
If you have a TV with low latency for gaming, 4K, and 120+ Hz, then you have a really expensive TV, and you likely care about quality. I'd reckon most of this population also owns a separate monitor for PC gaming.
Up until a year or two ago, the majority of monitors (and graphics cards) used DisplayPort 1.4 and HDMI 2.1, with HDMI 2.1 (~42 Gbps) having more bandwidth than DisplayPort 1.4 (~26 Gbps).
This is my case with my relatively new/high-end RTX 4080 and OLED monitor. So until I upgrade both, I use HDMI to be able to drive a 1440p 240hz 10-bit HDR signal @ 30 Gbps.
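If you want to sanity-check those numbers, here's a rough back-of-the-envelope sketch (the blanking overhead and encoding efficiencies are approximations, not spec-exact figures):

```python
# Rough bandwidth math for a few display modes (approximate, not spec-exact).

def uncompressed_rate_gbps(h, v, hz, bpc, blanking=1.10):
    """Approximate uncompressed data rate in Gbit/s for an RGB/4:4:4 signal."""
    return h * v * hz * (bpc * 3) * blanking / 1e9

# Usable data rates after line encoding (approximate):
DP_1_4_HBR3  = 25.9   # 32.4 Gbit/s raw, 8b/10b encoding
HDMI_2_1_FRL = 42.7   # 48 Gbit/s raw, 16b/18b encoding

modes = {
    "4K60 8-bit":      (3840, 2160,  60,  8),
    "4K120 10-bit":    (3840, 2160, 120, 10),
    "1440p240 10-bit": (2560, 1440, 240, 10),
}
for name, mode in modes.items():
    need = uncompressed_rate_gbps(*mode)
    dp   = "fits" if need <= DP_1_4_HBR3 else "needs DSC"
    hdmi = "fits" if need <= HDMI_2_1_FRL else "needs DSC"
    print(f"{name:16s} ~{need:5.1f} Gbit/s | DP 1.4: {dp:9s} | HDMI 2.1: {hdmi}")
```

The 1440p 240Hz 10-bit case lands right around 30 Gbit/s, which is why it fits in HDMI 2.1 but not in DisplayPort 1.4 without DSC.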
I had said I wouldn’t upgrade from my RTX 3080 until I could run “true 4K”.
I finally got the 240Hz 4K uncompressed, but it required buying a $1300 Asus OLED monitor and the RTX 5090. It looks amazing though, even with frame gen. Monster Hunter had some particularly breathtaking HDR scenes. I think it uses DisplayPort 2.1? Even finding the cable is difficult; Microcenter didn't have them in April, and the only one that worked was the one that came with the monitor.
This is the first I learned of this since personally I have no need of anything over 4k@60 (that already borders on absurd in my mind). I'm curious if this is something that's likely to get reverse engineered by the community at large?
Outrageous that a ubiquitous connection protocol is allowed to be encumbered in this way.
For the particular use case I mentioned in my earlier post (Visual Pinball), 4k@120 is actually a pretty big deal. We often play on screens 42" and up so the 4k detail is put to good use and makes things like the instruction cards in the corners legible. But the bigger difference is the smoothness in gameplay that 120Hz gets you. The ball travels really fast so 120 Hz helps gameplay a lot while reducing lag at the same time. And because a large chunk of the playfield is static at any one time, you don't need something like a 5090 to hit 120 Hz at that resolution like you might with a triple-A shooter.
TVs don't support displayport, so it makes Linux PCs like the Steam Machine inferior console replacements if you want high refresh rates. A lot of TVs now support 4K/120hz with VRR, the PS5 and Xbox Series X also support those modes.
(Some games support 120, but it's also used to present a 40hz image in a 120hz container to improve input latency for games that can't hit 60 at high graphics quality.)
It took a long time to move from the old component input over to HDMI. The main thing that drove it was the SD to HD change. You needed HDMI to do 1080p (I believe, IDK that component ever supported that high of a resolution).
Moving from HDMI to display port is going to be the same issue. People already have all their favorite HDMI devices plugged in and setup for their TVs.
You need a feature that people want which HDMI isn't or can't provide in order to incentivize a switch.
For example, perhaps display port could offer something like power delivery. That could allow things like media sticks to be solely powered by the TV eliminating some cable management.
The legacy issue is even worse than that. I have a very new Onkyo RZ30 receiver and it is all HDMI with no DisplayPort to be seen. So it is the whole ecosystem including the TV that would need to switch to DP support.
> For example, perhaps display port could offer something like power delivery.
It already does. A guaranteed minimum of 1.65W at 3.3V is to be provided. Until very recently, HDMI only provided a guaranteed minimum of something like 0.25W at 5V.
It's not nothing, but it's also very little to play with.
5W is what I'd think is about the minimum for doing something useful. 25W would actually be usable by a large swath of devices. The Raspberry Pi 4, for example, has a 10W requirement. Amazon's Fire Stick has a ~5W requirement.
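Putting rough numbers on it (the device draws below are ballpark figures, not measurements):

```python
# Ballpark: guaranteed connector power vs. what typical media devices need.

source_budget_w = {
    "HDMI (historical minimum)": 5.0 * 0.05,   # ~0.25 W at 5 V
    "DisplayPort minimum":       3.3 * 0.50,   # 1.65 W at 3.3 V
}
device_need_w = {
    "Amazon Fire Stick": 5.0,    # rough
    "Raspberry Pi 4":    10.0,   # recommended supply
}

for src, budget in source_budget_w.items():
    for dev, need in device_need_w.items():
        verdict = "enough" if budget >= need else f"short by {need - budget:.2f} W"
        print(f"{src:26s} vs {dev:18s}: {verdict}")
```

Either way, both connectors fall an order of magnitude short of what a stick or a Pi actually draws.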
> It's not nothing, but it's also very little to play with.
Sure. But it's ~6.6x more than what HDMI has historically guaranteed. It's pretty obvious to anyone with two neurons to spark together that the problem here isn't "amount of power you can suck out of the display port". If it were, DP would have swept away HDMI ages ago.
> It's pretty obvious to anyone with two neurons to spark together that the problem here isn't "amount of power you can suck out of the display port".
Nobody said it was.
I gave that as an example of a feature that DP might adopt in order to sway TV manufacturers and media device manufacturers to adopt it.
But not for nothing, 0.25W and 1.65W are virtually the same thing in terms of application. Just because it's "6.6x more" doesn't mean that it's usable. 0.25W is 25x more than 0.01W; that doesn't make it practically usable for anything related to media.
> But not for nothing, 0.25W and 1.65W are virtually the same thing in terms of application.
You really can't power an HDMI (or DisplayPort) active cable on 0.25W. You can on 1.65W. This is why in mid-June 2025 the HDMI consortium increased the guaranteed power to 1.5W at 5V. [0] It looks pretty bad when active DP cables (and fiber-optic DP cables) never require external power to function, but (depending on what you plug it into) the HDMI version of the same thing does.
> Nobody said it was.
You implied that it was, in a bit of sophistry that's in the same class as the US Federal Government saying "Of course States' compliance with this new Federal regulation is completely voluntary: we cannot legally require them to comply. However, we will be withholding vital Federal funds from those States that refuse to comply. As anyone can plainly see, their compliance is completely voluntary!".
DP 1.4 could have offered 4kW over its connector and TVs would still be using HDMI. Just as Intel and Microsoft ensured the decades-long reign of Wintel prebuilt machines [1], it's the consortium that controls the HDMI standard that's actively standing in the way of DP deploying in the "home theater".
[0] "HDMI 2.1b, Amendment 1 adds a new feature: HDMI Cable Power. With this feature, active HDMI® Cables can now be powered directly from the HDMI Connector, without attaching a separate power cable." from: <https://web.archive.org/web/20250625155950/https://www.hdmi....>
[1] The Intel part is the truly loathsome part. I care a fair bit less about Microsoft's dirty dealings here.
> You implied that it was, in a bit of sophistry that's in the same class as the US Federal Government saying "Of course States' compliance with this new Federal regulation is completely voluntary
This is a very bad faith interpretation of my comment. I did not imply it and I'm not trying to use CIA tricks to make people implement it as a feature.
I think it's not really an issue for 95-99% of users, who use devices with non-open-source drivers, so there is no incentive for manufacturers to add it?
For the same sorts of reasons that made it so for decades nearly every prebuilt PC shipped with an Intel CPU and Windows preinstalled: dirty backroom dealings. But in this case, the consortium that controls HDMI are the ones doing the dealings, rather than Intel and Microsoft.
"But Displayport doesn't implement the TV-control protocols that I use!", you say. That's totally correct, but DisplayPort has the out-of-band control channel needed to implement that stuff. If there had been any real chance of getting DisplayPort on mainstream TVs, then you'd see those protocols in the DisplayPort standard, too. As it stands now, why bother supporting something that will never, ever get used?
Also, DP -> HDMI active adapters exist. HDR is said to work all the time, and VRR often works, but it depends on the specifics of the display.
Correction: you can get 4K@120Hz with HDMI 2.0, but you won't get full 4:4:4 chroma; instead, 4:2:0 will be forced.
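Roughly, the math behind that (ignoring blanking, so real-world numbers are a bit higher):

```python
# Why HDMI 2.0 can only carry 4K120 with 4:2:0 chroma subsampling (8-bit, rough math).

HDMI_2_0_DATA_GBPS = 14.4   # 18 Gbit/s raw, 8b/10b encoding

def rate_gbps(h, v, hz, bits_per_pixel):
    return h * v * hz * bits_per_pixel / 1e9

full_444 = rate_gbps(3840, 2160, 120, 24)  # 4:4:4 -> 24 bits per pixel
sub_420  = rate_gbps(3840, 2160, 120, 12)  # 4:2:0 -> 12 bits per pixel on average

print(f"4:4:4 needs ~{full_444:.1f} Gbit/s (doesn't fit in {HDMI_2_0_DATA_GBPS})")
print(f"4:2:0 needs ~{sub_420:.1f} Gbit/s (fits in {HDMI_2_0_DATA_GBPS})")
```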
In my case I have an HTPC running Linux with a Radeon 6600 connected via HDMI to a 4K@120Hz-capable TV, and honestly, at that sitting distance/TV size and using 2x DPI scaling you just can't tell any chroma subsampling is happening. It is of course a ginormous problem in a desktop setting and even worse if you try using 1x DPI scaling.
What you will lose, however, are the newer forms of VRR, and it may be unstable with lots of dropouts.
I'm a bit puzzled; isn't VRR more for low-powered hardware to consume less battery (handhelds like the Steam Deck)? How does it fit hardware that is constantly connected to power?
Variable refresh rate is nice when your frame rate doesn't match your display's refresh rate, especially as you get into higher refresh rates. If your display is running at 120Hz but you're only outputting 100 fps, you cannot fit 100 frames evenly into 120 refreshes: 1/6 of the refreshes have to repeat a previous frame, and in an inconsistent manner. That's usually called judder.
Most TVs will not let you set the refresh rate to 100Hz. Even if my computer could run a game at 100 fps, without VRR my choices are either lots of judder or dropping to 60Hz. That's a wide range of possible refresh rates you're missing out on.
V-Sync and console games do this too at 60Hz: if you can't reach 60 fps, cap the game at 30 fps to prevent the judder that would come from anything in between 31-59. The Steam Deck actually does not support VRR; instead, its display can be set to anything from 40-60Hz.
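Here's a toy sketch of the 100 fps on a 120Hz display case above, just to make the 1/6 repeated-frames figure concrete (it assumes frames arrive at a perfectly even cadence, which real games don't):

```python
# Toy simulation: presenting a steady 100 fps stream on a fixed 120 Hz display.

display_hz = 120
source_fps = 100
window_s   = 0.1   # simulate 100 ms

refresh_times = [i / display_hz for i in range(int(window_s * display_hz))]
frame_times   = [i / source_fps for i in range(int(window_s * source_fps))]

shown = []
for t in refresh_times:
    # The display scans out the newest frame that finished before this refresh.
    latest = max(i for i, ft in enumerate(frame_times) if ft <= t)
    shown.append(latest)

repeats = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print(f"{repeats} of {len(shown)} refreshes repeat the previous frame "
      f"({repeats / len(shown):.0%})")
# With VRR the display instead refreshes whenever a new frame is ready,
# so there are no repeated frames and no uneven pacing.
```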