this post was submitted on 19 May 2024
124 points (93.7% liked)

[–] jsomae@lemmy.ml 4 points 5 months ago (1 children)

Yeah I agree with you, but I was just refuting your claim that it's not perceivable even if you try.

[–] fushuan@lemm.ee 0 points 5 months ago

Oh yeah, I've read and heard plenty of people saying that they definitely notice it. I'm lucky enough not to, because most ARPGs I play don't hold 60 FPS in intense combat, let alone 120 FPS on an RTX 3080 lmao.

I was talking more about the jump to 240 Hz and beyond, where I find it surprising that people notice the upgrade during intense gaming encounters, as opposed to calmly checking or testing. I guess there are people who do notice, but again, running games at such high frame rates is very expensive for the GPU and a waste most of the time.
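Some napkin math backs this up: frame time is the inverse of frame rate, so each doubling of refresh rate saves half as many milliseconds as the previous one. A quick sketch (the rate ladder chosen here is just illustrative):

```python
# Frame time is 1000 / fps in milliseconds, so the absolute latency
# saved by each refresh-rate jump shrinks as the rate climbs.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

rates = [30, 60, 120, 240, 360]
for prev, cur in zip(rates, rates[1:]):
    saved = frame_time_ms(prev) - frame_time_ms(cur)
    print(f"{prev} -> {cur} Hz: saves {saved:.2f} ms per frame")
```

Going from 30 to 60 Hz saves about 16.7 ms per frame, while going from 240 to 360 Hz saves only about 1.4 ms, even though the GPU has to render 1.5x as many frames.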

I'm just kinda butthurt that people treat screens below 120 Hz as bad, when most games I play hardly run smoothly at 60 FPS. The market will follow, and in a few years we'll hardly have what I consider normal monitors, and cards will just eat way more electricity for very small gains.