This post was submitted on 26 Jun 2025
78 points (96.4% liked)

[–] kbal@fedia.io 31 points 5 days ago (1 children)

I hate HDMI with a passion that cannot be explained.

[–] Thedogdrinkscoffee@lemmy.ca 5 points 5 days ago (3 children)
[–] taaz@biglemmowski.win 60 points 5 days ago (2 children)

Well, one very good reason is that the HDMI specification is closed, and as such not even HDMI Forum partner AMD can implement it in their open-source driver.

https://www.phoronix.com/news/HDMI-Closed-Spec-Hurts-Open

The DisplayPort spec is fully open, btw.

[–] who@feddit.org 12 points 4 days ago

Folks here might also find this article interesting:

DisplayPort: A Better Video Interface

[–] Thedogdrinkscoffee@lemmy.ca 4 points 4 days ago

Thanks for a great answer.

[–] kbal@fedia.io 24 points 5 days ago (1 children)

I don't know, there's just something about it.

For a long time we had VGA for video cables. There was no VGA version 2.1.9, now supporting 1024x768 mode with 16-bit colour. Cables did not cost $29. There were no rent-seeking patent holders charging license fees, or at least they weren't obnoxious enough that we knew about them. It didn't have five different types of connectors. There was no VGA consortium constantly keeping itself in the news with periodic press releases. Companies didn't need to sign away their soul to write drivers for it. There was no VGA copy protection trying to keep us from decoding our own video streams. Cables didn't include enough microelectronics to power a space shuttle.

Somehow I think we could do better.

[–] Thedogdrinkscoffee@lemmy.ca 1 points 4 days ago

I get it now. Thanks.

[–] passepartout@feddit.org 7 points 5 days ago

Here is a Techquickie (LTT) video comparing it to DisplayPort.

[–] jinwk00@sh.itjust.works 28 points 5 days ago

DisplayPort over HDMI!!

[–] FeelzGoodMan420@eviltoast.org 9 points 4 days ago* (last edited 4 days ago)

Remind me in like 6+ years when the standard is actually widely adopted. Many high-end OLED monitors today, in the year 2025, still ship with fucking HDMI 2.0 and DisplayPort 1.4.

[–] Fenrisulfir@lemmy.ca 12 points 5 days ago

Great. What’s the max length? 6”?

[–] Sibbo@sopuli.xyz 12 points 5 days ago (3 children)

Beyond 1080p60 I do still notice a difference, but I'm not willing to pay much more to push it further.

[–] passepartout@feddit.org 14 points 5 days ago (1 children)

1080p60 was the norm for a long time. 1440p144 is the current sweet spot for desktop/gaming, I suppose.

[–] Petter1@lemm.ee 4 points 5 days ago (1 children)

I personally prefer 4K60 (of course, higher Hz is better).

I adjust the zoom level according to screen size; on a 32″ 4K display I mostly keep it around 125% zoom.

On my 14″ I have, I think, 2550, which also looks amazing and is still usable at 125%.

I adapt the zoom level to what I'm doing: I like having the option of tiny icons and lots of space, or making everything big if, for example, I'm in a meeting and have to sit far enough back to be properly visible to the webcam.
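Roughly, what those zoom percentages mean (a sketch, assuming the common model where the OS lays out the UI as if the screen had its physical pixels divided by the scale factor; compositors differ in the details):

```python
# Logical desktop size at a given scaling factor: the OS renders UI as if
# the screen were physical_pixels / scale in size, so higher zoom means
# bigger UI elements but less workspace.

def logical_size(width: int, height: int, scale: float) -> tuple[int, int]:
    return round(width / scale), round(height / scale)

print(logical_size(3840, 2160, 1.25))  # 32" 4K at 125% -> (3072, 1728)
print(logical_size(3840, 2160, 1.00))  # 4K at 100%     -> (3840, 2160), tiny icons
```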

[–] passepartout@feddit.org 4 points 5 days ago (1 children)

I meant that 1440p144 is a sweet spot in terms of price/performance, imho. The rest of the hardware, especially the GPU, has to be considered as well.

Even on a 1440p 27″ LCD I zoom in to about 133%, mostly for the viewing experience of the people I share my screen with.

I'd love an OLED with the same specs, but they are still too expensive given that they may eventually suffer from burn-in.

[–] Petter1@lemm.ee 3 points 5 days ago (1 children)

Yeah, I always share only a window, so that I don't have to change resolution or zoom during meetings; very valid point.

And on pricing I agree as well (especially if you consider that more pixels need more GPU power), but if you're not too picky, you can get a 4K60 screen for under $300.

1440p at 27″ and 133% sounds to me like you don't have much space to put multiple apps side by side?

I most likely sit closer to the screen than most people; that may be the reason for my preferences 😄

[–] passepartout@feddit.org 2 points 5 days ago

My work environment is chaotic enough that I have to cycle through 4 different instances of VSCode, terminals and Firefox, while simultaneously doing tech support for Windows issues. I'd have switched to Linux if it weren't for that last bit.

I work on a 14″ laptop whose 1080p60 panel is the second display, while the 27″ 1440p is the main one. I connect through a USB-C dongle and can therefore only get 60 Hz, because the screen flickers otherwise (though on Linux the dongle works even at 144 Hz, which is above its 120 Hz rating, but I digress).

I'm a bit constrained on space, so I use only my laptop + screen for work and just the single screen for my personal rig, which is kind of a bummer. I'll opt for a 4K ~120 Hz ~40–50″ OLED TV as my next second "monitor" though (:

I mostly just want displays to be something I don't have to worry about. Even with only a single port, being able to connect three 4K monitors without worrying about their refresh rates is convenient.

[–] Petter1@lemm.ee 3 points 5 days ago (2 children)

I need 4K to be happy. With 1080p you get giant windows in your OS (most apps are only usable in fullscreen) even at 100% zoom, and you can still easily make out individual pixels…

Straight-up unusable for me. Maybe on a phone of at most 5″, 1080p is a good middle ground (battery vs. resolution vs. not seeing individual pixels).

[–] moonlight@fedia.io 3 points 5 days ago (1 children)

Yeah, 1080p is fine on a small laptop screen, or a small TV on the other side of the room, but it's unusable for desktop applications. Even 1440 is noticeably low res. I disagree about phones, though. I think 1080p is overkill and 720p is fine.

[–] Petter1@lemm.ee 1 points 4 days ago

I think with phones it's very important to factor in the screen size.

720p is fine, but on 7″+ phones I think you'd be glad to have a DPI similar to that of the smaller 720p phones.
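Rough math for that, using PPI (pixels per inch); the sizes below are just illustrative:

```python
import math

# Pixel density (PPI) = diagonal pixel count / diagonal size in inches.

def ppi(width: int, height: int, diagonal_inches: float) -> float:
    return math.hypot(width, height) / diagonal_inches

print(f'5.0" 720p:  {ppi(1280, 720, 5.0):.0f} PPI')   # ~294
print(f'7.0" 720p:  {ppi(1280, 720, 7.0):.0f} PPI')   # ~210, noticeably coarser
print(f'7.0" 1080p: {ppi(1920, 1080, 7.0):.0f} PPI')  # ~315, back to "fine"
```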

[–] alsimoneau@lemmy.ca 0 points 4 days ago (1 children)

You sound like you've never gamed at 240p

[–] Petter1@lemm.ee 1 points 4 days ago

Joke's on you 😁 the first game I ever played was on a 240×160 screen

(2.9″)

[–] some_guy@lemmy.sdf.org 10 points 5 days ago (1 children)

Of course you need a new cable. You're not getting massive upgrades in fidelity on your old crappy cable.

[–] MouldyCat@feddit.uk 7 points 4 days ago

But… but… it has gold-plated connectors 😟

[–] skisnow@lemmy.ca 9 points 5 days ago (1 children)

Is the 480 Hz support "just because", or is there any kind of use case for it?

[–] noride@lemmy.zip 14 points 5 days ago (1 children)

I think it's more that the bandwidth needed to support 12K at 120 Hz also allows for 4K at 480 Hz, so… ¿por qué no los dos?
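Back-of-the-envelope math (a sketch only: it ignores blanking intervals and link encoding, real signals at these rates typically use DSC compression, and "12K" is assumed to mean 11520×6480):

```python
# Raw uncompressed video bandwidth: width * height * refresh * bits/pixel.

def raw_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    return width * height * hz * bpp / 1e9

print(f"4K  @ 480 Hz: {raw_gbps(3840, 2160, 480):.0f} Gbit/s")   # ~96
print(f"12K @ 120 Hz: {raw_gbps(11520, 6480, 120):.0f} Gbit/s")  # ~215
```

So anything that can carry 12K at 120 Hz has headroom to spare for 4K at 480 Hz.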

[–] fubarx@lemmy.world 6 points 4 days ago

12K, brought to you by the Hollywood Face Makeup and CGI Alliance.