
Does AMD not support DisplayPort? I'm not an expert on this, but that sounds to me like the superior technology.




TVs don't support displayport, which makes Linux PCs like the Steam Machine inferior console replacements if you want high refresh rates. A lot of TVs now support 4K/120hz with VRR, and the PS5 and Xbox Series X also support those modes.

(Some games support 120, but it's also used to present a 40hz image in a 120hz container to improve input latency for games that can't hit 60 at high graphics quality.)
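
(If you're wondering why 40 fits so neatly into 120: the refresh rate has to be an integer multiple of the frame rate so that every frame is held for the same number of refreshes. A rough sketch of my own, not from any spec:)

  # Which frame rates pace evenly inside a fixed-refresh container: a frame
  # rate fits cleanly only when the refresh rate is an integer multiple of it,
  # so every game frame is held for the same number of display refreshes.
  def paces_evenly(fps: int, refresh_hz: int) -> bool:
      return refresh_hz % fps == 0

  for refresh in (60, 120):
      clean = [fps for fps in range(20, refresh + 1) if paces_evenly(fps, refresh)]
      print(f"{refresh}hz container fits: {clean}")

  # 60hz container fits: [20, 30, 60]
  # 120hz container fits: [20, 24, 30, 40, 60, 120]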


Why don't TVs support displayport? If HDMI 2.1 support is limited, a TV with displayport sounds like an obvious choice.

I thought audio might be the reason, but as far as I can tell, displayport supports that too.


Legacy is a bitch.

It took a long time to move from the old component input over to HDMI. The main thing that drove it was the SD to HD change. You needed HDMI to do 1080p (I believe; I don't know that component ever supported a resolution that high).

Moving from HDMI to DisplayPort is going to be the same issue. People already have all their favorite HDMI devices plugged in and set up for their TVs.

You need a feature that people want and that HDMI isn't providing or can't provide in order to incentivize a switch.

For example, perhaps DisplayPort could offer something like power delivery. That could allow things like media sticks to be powered solely by the TV, eliminating some cable management.


The legacy issue is even worse than that. I have a very new Onkyo RZ30 receiver and it is all HDMI with no DisplayPort to be seen. So it is the whole ecosystem including the TV that would need to switch to DP support.

> For example, perhaps display port could offer something like power delivery.

It already does. A guaranteed minimum of 1.65W at 3.3V is to be provided. Until very recently, HDMI only provided a guaranteed minimum of something like 0.25W at 5V.


It's not nothing, but it's also very little to play with.

5W is what I'd think is about the minimum for doing something useful. 25W would actually be usable by a large swath of devices. The Raspberry Pi 4, for example, has a 10W requirement. Amazon's Fire Stick has a ~5W requirement.
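
Rough numbers, assuming the commonly quoted pin budgets (3.3V/500mA for DP, ~5V/50mA for older HDMI) and approximate device draws:

  # Back-of-the-envelope comparison (rough figures of my own, not measured values):
  # commonly quoted connector power budgets vs. approximate device draws.
  connector_budget_w = {
      "DisplayPort (3.3V x 500mA)": 3.3 * 0.5,  # ~1.65W guaranteed minimum
      "Older HDMI (5V x ~50mA)": 5.0 * 0.05,    # ~0.25W guaranteed minimum
  }
  device_draw_w = {
      "Fire TV Stick (approx.)": 5.0,
      "Raspberry Pi 4 (recommended supply)": 10.0,
  }
  for conn, budget in connector_budget_w.items():
      for dev, draw in device_draw_w.items():
          verdict = "enough" if budget >= draw else "not enough"
          print(f"{conn}: {budget:.2f}W vs {dev} at ~{draw:.0f}W -> {verdict}")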


> It's not nothing, but it's also very little to play with.

Sure. But it's ~6.6x more than what HDMI has historically guaranteed. It's pretty obvious to anyone with two neurons to spark together that the problem here isn't "amount of power you can suck out of the display port". If it were, DP would have swept away HDMI ages ago.


> It's pretty obvious to anyone with two neurons to spark together that the problem here isn't "amount of power you can suck out of the display port".

Nobody said it was.

I gave that as an example of a feature that DP might adopt in order to sway TV manufacturers and media device manufacturers to adopt it.

But not for nothing, 0.25W and 1.65W are virtually the same thing in terms of application. Just because it's "6.6x more" doesn't mean that it's usable. 0.25W is 25x more than 0.01W; that doesn't make it practically usable for anything related to media.


> But not for nothing, 0.25W and 1.65W are virtually the same thing in terms of application.

You really can't power an HDMI (or DisplayPort) active cable on 0.25W. You can on 1.65W. This is why in mid-June 2025 the HDMI consortium increased the guaranteed power to 1.5W at 5V. [0] It looks pretty bad when active DP cables (and fiber-optic DP cables) never require external power to function, but (depending on what you plug it into) the HDMI version of the same thing does.

> Nobody said it was.

You implied that it was in a bit of sophistry that's the same class as the US Federal Government saying "Of course States' compliance with this new Federal regulation is completely voluntary: we cannot legally require them to comply. However, we will be withholding vital Federal funds from those States that refuse to comply. As anyone can plainly see, their compliance is completely voluntary!".

DP 1.4 could have offered 4kW over its connector and TVs would still be using HDMI. Just as Intel and Microsoft ensured the decades-long reign of Wintel prebuilt machines [1], it's the consortium that controls the HDMI standard that's actively standing in the way of DP deploying in the "home theater".

[0] "HDMI 2.1b, Amendment 1 adds a new feature: HDMI Cable Power. With this feature, active HDMI® Cables can now be powered directly from the HDMI Connector, without attaching a separate power cable." from: <https://web.archive.org/web/20250625155950/https://www.hdmi....>

[1] The Intel part is the truly loathsome part. I care a fair bit less about Microsoft's dirty dealings here.


> You implied that it was in a bit of sophistry that's the same class as the US Federal Government saying "Of course States' compliance with this new Federal regulation is completely voluntary

This is a very bad faith interpretation of my comment. I did not imply it and I'm not trying to use CIA tricks to make people implement it as a feature.

Are you upset that I gave an example?


Sophistry might have been considered a CIA-grade trick ~2,500 years ago, but it's pretty well known by now.

I think it's not really an issue for the 95-99% of users whose devices use non-open-source drivers, so there's no incentive for manufacturers to add it?

Tell Valve that it isn't an issue. They have built hardware support for HDMI 2.1 into the new Steam Machine but can't support it in software.

> Why don't TVs support displayport?

For the same sorts of reasons that made it so for decades nearly every prebuilt PC shipped with an Intel CPU and Windows preinstalled: dirty backroom dealings. But in this case, the consortium that controls HDMI are the ones doing the dealings, rather than Intel and Microsoft.

"But Displayport doesn't implement the TV-control protocols that I use!", you say. That's totally correct, but DisplayPort has the out-of-band control channel needed to implement that stuff. If there had been any real chance of getting DisplayPort on mainstream TVs, then you'd see those protocols in the DisplayPort standard, too. As it stands now, why bother supporting something that will never, ever get used?

Also, DP -> HDMI active adapters exist. HDR is said to work all the time, and VRR often works, but it depends on the specifics of the display.


Correction: you can get 4K@120hz with HDMI 2.0, but you won't get full 4:4:4 chroma; instead, 4:2:0 will be forced.

In my case I have an HTPC running Linux with a Radeon 6600 connected via HDMI to a 4K@120hz-capable TV, and honestly, at that sitting distance/TV size and with 2x DPI scaling you just can't tell any chroma subsampling is happening. It is of course a ginormous problem in a desktop setting, and even worse if you try using 1x DPI scaling.

What you will lose, however, are the newer forms of VRR, and the signal may be unstable with lots of dropouts.
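
The bandwidth arithmetic behind that, roughly (active pixels only; blanking intervals make the real requirement somewhat higher):

  # Why HDMI 2.0 forces 4:2:0 at 4K/120hz: rough pixel-data bandwidth vs. the
  # link's usable data rate (active pixels only; real links also carry blanking).
  W, H, HZ = 3840, 2160, 120
  HDMI20_DATA_RATE_GBPS = 14.4  # 18 Gbit/s link rate minus 8b/10b encoding overhead

  bits_per_pixel = {"RGB/4:4:4 (8-bit)": 24, "4:2:0 (8-bit)": 12}
  for fmt, bpp in bits_per_pixel.items():
      gbps = W * H * HZ * bpp / 1e9
      verdict = "fits" if gbps <= HDMI20_DATA_RATE_GBPS else "does not fit"
      print(f"{fmt}: ~{gbps:.1f} Gbit/s of pixel data -> {verdict} in {HDMI20_DATA_RATE_GBPS} Gbit/s")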


Do consoles support anything above 60 FPS?

My PS5 can do 4k/120 hz with VRR support, not sure about the others.

I'm a bit puzzled: isn't VRR more for low-powered hardware to consume less battery (handhelds like the Steam Deck)? How does it fit hardware that is constantly connected to power?

(I assume VRR = Variable Refresh Rate)


Variable refresh rate is nice when your frame rate doesn't match your display's refresh rate, especially when you're getting into higher refresh rates. If your display is running at 120hz but you're only outputting 100fps, you cannot fit 100 frames evenly into 120 refreshes: 1/6 of them will have to be repeats of other frames, and in an inconsistent manner. This is usually called judder.

Most TVs will not let you set the refresh rate to 100hz. Even if my computer could run a game at 100fps, without VRR my choices are either lots of judder or lowering it to 60hz. That's a wide range of possible refresh rates you're missing out on.

V-Sync and console games will do this too at 60hz: if you can't reach 60fps, cap the game at 30fps to prevent the judder that would come from anything in between 31-59. The Steam Deck actually does not support VRR; instead, the display driver supports setting any refresh rate from 40-60hz.

This is also sometimes an issue with movies filmed at 24hz on 60hz displays too: https://www.rtings.com/tv/tests/motion/24p
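
To make the judder concrete, here's a little sketch of my own (not from any spec) of how many refreshes each frame gets held for on a fixed-refresh display without VRR:

  # How many display refreshes each frame is held for without VRR: every frame
  # must occupy a whole number of refreshes, so mismatched rates force an
  # uneven hold pattern, i.e. judder.
  def hold_pattern(fps: int, refresh_hz: int, frames: int = 10):
      pattern, shown = [], 0
      for i in range(1, frames + 1):
          # frame i is ready at time i/fps and is replaced at the first
          # refresh boundary at or after that moment
          needed = (i * refresh_hz + fps - 1) // fps  # integer ceiling
          pattern.append(needed - shown)
          shown = needed
      return pattern

  print(hold_pattern(100, 120))  # [2, 1, 1, 1, 1, 2, 1, 1, 1, 1] -> judder
  print(hold_pattern(24, 60))    # [3, 2, 3, 2, ...] -> the classic 3:2 pulldown
  print(hold_pattern(40, 120))   # [3, 3, 3, ...] -> even pacing, no judder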


It reduces screen tearing without adding all the latency that vsync introduces.

VRR is necessary to avoid tearing or FPS caps (V-Sync) when your hardware cannot stably output an FPS count matching the screen refresh rate.

Are there games running at 4k 120hz?

Call of Duty and Battlefield both run at 4K@120 with dynamic resolution scaling, PSSR or FSR.

Most single-player games (Spider-Man, God of War, Assassin's Creed, etc.) will offer a balanced graphics/performance mode which does 40fps in a 120hz refresh.


Full 4K: very few, but lots are running adaptive resolutions at >2K and at 120hz.

The Touryst renders the game at 4K120 or 8K60. In the latter case, the image is downsampled to 4K output.


