Posted to the Technology community on 12 Sep 2023

I absolutely hate "smart" TVs! You can't even buy a quality "dumb" panel anymore. I can't convince the rest of my family and friends that the only things those smarts bring are built-in obsolescence, ads, and privacy issues.

I make it a point to NEVER connect my new 2022 LG C2 to the Internet, as any possible improvements from firmware updates would be overshadowed by garbage like ads in the UI, removal of existing features (warning: reddit link), privacy violations, possible attack vectors, non-existent security, and the constant data breaches at manufacturers that threaten to expose every bit of personal data they suck up. Not to mention the increased sluggishness that sets in after years of unwanted "improvements" are stuffed into it, as the chipset ages and can no longer cope.

I'd much rather spend a tenth of the price of my TV on a streaming box (Roku, Shield TV, etc.) and replace it after similar things happen to it in a few years. For example, the display of my OG 32-inch Sony Google TV from 2010 ($500) still works fine, but the OS has long been abandoned by both Sony and Google, and since 2015-16 even basic things like the YouTube and Chrome apps no longer work. Thank goodness I can set the HDMI port as the default input at start-up, so I never have to see the TV's native UI, and a new Roku Streaming Stick ($45) does just fine on this 720p panel. Plus, I'm not locked into the Roku ecosystem: if they begin (continue?) enshittifying their products, there are tons of other options available at a similar price.

Most people don't replace their TVs every couple of years. Hell, my decade-old 60-inch Sharp Aquos 1080p LCD TV that I bought for $2200 back in 2011 still works fine, and in all that time I've only had to replace the streamer driving it twice: Sony Google TV Box -> Nvidia Shield TV 2015 -> Nvidia Shield TV 2019. I plan to keep it running in my basement until it dies completely. The Shield TV goes to the LG C2 so that I never have to see LG's craptastic UI.

Sorry, just felt the need to vent. I'd be very interested in reading the community's opinions on this topic.

[–] GenderNeutralBro@lemmy.sdf.org 5 points 1 year ago (2 children)

However, the other solution is the one you've already mentioned: never plug the smart TV into the internet, and instead bypass the "smart" parts of the TV with your own streaming boxes.

I did this for a long time on my old Vizio TV, but the experience was notably worse with external devices than with the built-in apps, due to limited framerate support over HDMI. This led to awkward juddering when trying to play 23.976 fps movies with only 30 Hz or 60 Hz output. It also meant built-in video features like motion interpolation didn't work effectively.
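
To make the judder concrete, here is a minimal illustrative sketch (Python, not anyone's actual player code) of what happens when ~23.976 fps film is shown on a fixed 60 Hz output: each frame is held for an uneven number of refresh cycles, the classic 3:2 pulldown cadence.

```python
import math

# Minimal sketch: why ~23.976 fps content judders on a fixed 60 Hz output.
# Each frame stays on screen until the first vsync at or after its ideal
# presentation time, so hold times quantize to whole refresh periods.

REFRESH_HZ = 60.0
CONTENT_FPS = 24000 / 1001  # ~23.976 fps (NTSC film rate)

def frame_hold_times(num_frames: int) -> list[float]:
    """Display duration in ms of each content frame on the fixed output."""
    vsync_ms = 1000.0 / REFRESH_HZ   # ~16.67 ms per refresh cycle
    frame_ms = 1000.0 / CONTENT_FPS  # ~41.71 ms per content frame
    holds, shown_at = [], 0.0
    for i in range(1, num_frames + 1):
        ideal = i * frame_ms  # when the next frame *should* appear
        next_vsync = math.ceil(ideal / vsync_ms) * vsync_ms
        holds.append(next_vsync - shown_at)
        shown_at = next_vsync
    return holds

print([round(h, 1) for h in frame_hold_times(8)])
# -> an uneven mix of ~50.0 and ~33.3 ms holds instead of a steady
#    ~41.7 ms; that alternation is the judder visible in slow pans.
```

Switch the output to 23.976 Hz (or an integer multiple of it) and every frame gets an identical hold time, which is why matching the refresh rate to the content removes the judder.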

I guess this is less of an issue today with VRR support on high-end TVs, but still, a lot of devices you might connect to a TV don't support VRR.

[–] beefcat@beehaw.org 6 points 1 year ago* (last edited 1 year ago) (1 children)

Your streaming box was either not configured properly or was very low-cost.

The most likely fix is to turn on the feature on your streaming box that sets the output refresh rate to match that of the content you are playing. On Apple TVs it is called "Match Frame Rate"; I know Rokus and Android TV devices have similar options.
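
For illustration, the logic behind such a setting is roughly the following (a hedged Python sketch, not any vendor's actual API; the mode list is made up): choose an output mode whose refresh rate is an integer multiple of the content's frame rate, so every frame is held for the same number of refresh cycles.

```python
# Hedged sketch of a "match frame rate" setting, not any vendor's real
# API: choose an output mode whose refresh rate is an integer multiple
# of the content frame rate, so every frame gets an equal hold time.

def pick_output_hz(content_fps: float, supported_hz: list[float]) -> float:
    """Return the lowest supported refresh rate that is an integer
    multiple of content_fps; fall back to the fastest mode otherwise."""
    for hz in sorted(supported_hz):
        ratio = hz / content_fps
        if round(ratio) >= 1 and abs(ratio - round(ratio)) < 0.001:
            return hz
    return max(supported_hz)  # no clean multiple: least-bad option

modes = [23.976, 24.0, 50.0, 59.94, 60.0]   # what the TV advertises
print(pick_output_hz(24000 / 1001, modes))  # -> 23.976 (1:1, no judder)
print(pick_output_hz(30000 / 1001, modes))  # -> 59.94 (2 cycles per frame)
```

If the only advertised mode were 60 Hz, the loop would fall through to the fallback and you'd get exactly the 3:2 judder described above.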

Newer TVs can detect when 24 fps content is being delivered in a 60 Hz signal and render it to the panel correctly, but this usually doesn't work if the selected input is set to a low-latency mode ("Game", "PC", etc.).
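
A toy version of that cadence detection might look like this (an illustrative sketch, nothing like a TV's real video pipeline): 24 fps film in a 60 Hz signal arrives with frames repeated in a 3,2,3,2 pattern, which the processor can spot and reverse.

```python
# Toy sketch of 3:2 cadence detection, not a real TV pipeline: 24 fps
# film delivered in a 60 Hz signal arrives as A A A B B C C C D D ...,
# and a processor that spots the alternating 3,2 run lengths can
# reconstruct the original 24 fps frames and render them judder-free.

def looks_like_32_pulldown(frames: list) -> bool:
    """True if the frame sequence repeats in an alternating 3,2 cadence."""
    # Collapse consecutive duplicate frames into run lengths.
    runs, count = [], 1
    for prev, cur in zip(frames, frames[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    inner = runs[1:]  # the first (and the dropped last) run may be truncated
    return (len(inner) >= 4
            and all(r in (2, 3) for r in inner)
            and all(a != b for a, b in zip(inner, inner[1:])))

# Letters stand in for distinct frames in the 60 Hz signal:
print(looks_like_32_pulldown(list("AAABBCCCDDEEEFF")))  # -> True  (24 fps film)
print(looks_like_32_pulldown(list("AABBCCDDEEFFGG")))   # -> False (2:2 = 30 fps)
```

A TV has to buffer and compare several incoming frames to spot the cadence, which is presumably why the detection is disabled in low-latency "Game"/"PC" modes.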

Good to hear newer devices support this.

My experience was from quite a few years ago (2015-ish). At the time, no device I tried had such a feature, including a few brands of Android phones, Fire TV sticks, and MacBooks. I remember digging through the documentation of other devices to find something better, with no luck. That said, documentation was pretty poor all around, so who knows? The most useful info I found was in threads on the VideoHelp and AVS forums, where other users reported similar issues on various devices. Android TV was still very new and very shitty back then.

At this point I would simply not buy anything that doesn't support VRR.

[–] TJmCAwesome@feddit.nu 4 points 1 year ago (2 children)

This is one of the downsides of the widespread adoption of HDMI; it has quite a few. Something like DisplayPort would be better, but it's far less common. Such is life.

[–] beefcat@beehaw.org 5 points 1 year ago* (last edited 1 year ago)

How is this a downside of HDMI?

It sounds to me like the user's TV or streaming box is configured incorrectly. DisplayPort doesn't magically remove judder from 24 fps content being rendered into a 60 Hz signal.

DisplayPort never saw widespread adoption in the home theater space because it never tried to. The standard is missing a ton of features that are critical to complex home theater setups but largely useless in a computer/monitor setup (CEC device control and audio return, for example). They aren't competing standards; they're built for different applications, and their feature sets reflect that.

[–] GenderNeutralBro@lemmy.sdf.org 3 points 1 year ago (1 children)

Newer revisions of HDMI are perfectly good, I think. I was surprised and dismayed by how slow adoption was. I saw so many devices with only HDMI 1.4 support for years after HDMI 2.0 and 2.1 were in production (probably still to this day, even). It's the biggest problem I have with my current display, which I bought in 2019.

[–] beefcat@beehaw.org 2 points 1 year ago* (last edited 1 year ago) (1 children)

GP's problem probably isn't even bandwidth; they likely just need to enable their TV's de-judder feature, or configure their streaming box to set the output refresh rate to match that of the content being played.

[–] GenderNeutralBro@lemmy.sdf.org 2 points 1 year ago (1 children)

VRR support came with HDMI 2.1.

You could still have your player device set to a static 24 or 30 Hz without VRR, in theory, but none of the devices I tried (granted, this was ~8 years ago) supported that anyway.

[–] beefcat@beehaw.org 4 points 1 year ago* (last edited 1 year ago)

VRR is really meant for video games.

> You could still have your player device set to a static 24 or 30 Hz without VRR, in theory, but none of the devices I tried (granted, this was ~8 years ago) supported that anyway.

That's interesting. Pretty much every Blu-ray player should support this, and I can confirm from firsthand experience that Apple TV, Roku, and Android TV devices all support it as well. I can't speak for Amazon's Fire Stick thingy, though.

The feature you are looking for is not manually setting the refresh rate, but having the device set it automatically based on the framerate of the content being displayed. On Apple TV it's called "Match Frame Rate".