this post was submitted on 08 Jan 2025
37 points (91.1% liked)

Out of the loop


A community that helps people stay up to date with things going on.


I saw a meme about something called "fake frames", but I don't know what happened.

[–] givesomefucks@lemmy.world 25 points 1 day ago* (last edited 1 day ago) (4 children)

"Fake frames" is "frame generation"; for Nvidia it's called ~~DLSS.~~

Rather than having the graphics card create 120 frames, you can crank the settings up to where you only get 60, then AI "guesses" what the next frame would show, doubling it to 120 while keeping the higher settings.

This can make things blurry because the AI may guess wrong. So every odd frame is real, every even frame is just a guess.

Frame 1: real

Frame 2: guess

Frame 3: real

If the guess for #2 is accurate, everything is cool. If #2 guessed that a target moved left when it actually moved right, then #3 corrects it, and that "blink" is the problem.
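
Roughly sketched in code (a toy Python illustration, not Nvidia's actual algorithm; a naive pixel blend stands in for the AI's guess, where the real tech uses motion vectors and a trained model):

```python
import numpy as np

def guess_frame(real_a, real_b):
    # Toy stand-in for the AI's guess: naively blend the two rendered
    # frames around the gap. The real tech uses motion vectors and a
    # trained model, not a dumb pixel average.
    return ((real_a.astype(np.uint16) + real_b.astype(np.uint16)) // 2).astype(np.uint8)

def interleave(real_frames):
    # Output pattern: real, guess, real, guess, ... roughly doubling the frame count.
    out = []
    for real_a, real_b in zip(real_frames, real_frames[1:]):
        out.append(real_a)                       # odd frames: real
        out.append(guess_frame(real_a, real_b))  # even frames: guessed
    out.append(real_frames[-1])
    return out

# 60 rendered frames in, ~120 frames out (tiny resolution just for the example)
rendered = [np.random.randint(0, 256, (90, 160, 3), dtype=np.uint8) for _ in range(60)]
print(len(interleave(rendered)))  # 119
```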

The bigger issue is developers relying on that tech so they don't have to optimize their code. So rather than DLSS being an extra oomph, it's going to be required for "acceptable" performance.

[–] Coelacanth@feddit.nu 24 points 1 day ago (2 children)

Not to be nitpicky but DLSS is a different technology than frame generation, though it also involves AI guessing - just in a different way. DLSS (Deep Learning Super Sampling) means rendering the game at a lower resolution than your screen's output, then having it upscaled to the correct resolution via AI. This is much more performance friendly than native rendering and can often lead to a better looking visual end product than turning graphics features off and rendering natively - though it will depend on the game, genre and personal preference.
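
To make the distinction concrete, here's a rough sketch of the render-low-then-upscale idea (illustrative Python; a crude nearest-neighbour resize stands in for the neural upscaler, and the resolutions are just example numbers):

```python
import numpy as np

TARGET_W, TARGET_H = 3840, 2160   # what the screen actually displays
SCALE = 2                         # e.g. a "Quality"-style mode: render at half resolution per axis

def render_scene(width, height):
    # Placeholder for the expensive part: the game only renders this many pixels.
    return np.random.randint(0, 256, (height, width, 3), dtype=np.uint8)

def upscale(frame, factor):
    # Crude nearest-neighbour upscale standing in for the AI model, which
    # also uses motion vectors and previous frames to reconstruct detail.
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

low_res = render_scene(TARGET_W // SCALE, TARGET_H // SCALE)  # 1920x1080 rendered
output = upscale(low_res, SCALE)                              # 3840x2160 displayed
print(low_res.shape, "->", output.shape)  # only a quarter of the output pixels were rendered
```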

Frame generation is as you described. Worth noting is that DLSS without frame generation doesn't suffer from artifacts and input lag in the same way as with FG turned on. Frame generation also works better the higher your base frame rate is, so it's a bit of a "win-more". Using FG to go from 30 to 60 FPS will feel much worse than using it to go from 60 to 120.

The "fake frames" memes, I believe, stem from the updated frame generation technology in the 50 series guessing three frames at a time instead of one. So in effect, a majority of the frames you see will be "fake".
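
Back-of-the-envelope with assumed example numbers, showing why most displayed frames end up generated and why responsiveness still tracks the base rate:

```python
rendered_fps = 60          # frames the game actually renders per second
generated_per_real = 3     # 50-series multi frame generation

displayed_fps = rendered_fps * (1 + generated_per_real)      # 240 frames shown per second
fake_share = generated_per_real / (1 + generated_per_real)   # 0.75 -> 75% of what you see is generated

# Your inputs only affect rendered frames, so the game still *feels*
# like ~60 FPS even though 240 FPS is displayed.
print(displayed_fps, f"{fake_share:.0%}")
```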

[–] nekusoul@lemmy.nekusoul.de 6 points 1 day ago* (last edited 1 day ago)

On the other hand, NVIDIA started consolidating all these technologies under the NVIDIA DLSS suite a few months ago, for some reason.

So it's DLSS Super Resolution and DLSS Frame Generation, DLSS Ray Reconstruction and so on, with the exception of DLAA. Probably because that would get too stupid even for them.

[–] givesomefucks@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

> DLSS is a different technology than frame generation

Thanks! Got them mixed up

2 fake frames instead of just one I hadn't heard about either. I already leave it off on a 4070 Super because 1:1 is already bad enough.

[–] stankmut@lemmy.world 17 points 1 day ago (1 children)

To add on to this, the 5000 series now generates 3 fake frames per real frame instead of just 1.

[–] NewNewAccount@lemmy.world 1 points 1 day ago (2 children)

Is “fake” being used as a pejorative here?

[–] stankmut@lemmy.world 3 points 1 day ago (1 children)

I was just using the term that the previous commenter used to keep terms consistent.

[–] NewNewAccount@lemmy.world 2 points 1 day ago

Yeah not sure if there’s a better word to use without coming across as pedantic.

Fake certainly implies these are worse (which they of course are), but I’m not sure if they’re that much worse. I think in many scenarios the proverbial juice would absolutely be worth the squeeze, but naysayers seem to disagree with that sentiment.

[–] ch00f@lemmy.world 8 points 1 day ago (2 children)

Can someone explain how AI can generate a frame faster than the conventional method?

[–] MrPoopbutt@lemmy.world 4 points 1 day ago

It's image processing with statistics rather than traditional rendering, so it's a completely separate process. Also, Nvidia GPUs (and the new upcoming AMD ones) have hardware built into the chip specifically for this.
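
A very rough way to see why it's faster (toy Python with made-up cost constants, not how any real driver measures this): rendering cost scales with scene complexity, while frame generation only touches the already-finished images, so its cost is roughly fixed per pixel.

```python
def render_cost(num_triangles, pixels):
    # Conventional rendering: geometry, shading, ray tracing, etc.
    # Work grows with how complex the scene is. (Made-up constants.)
    return num_triangles * 50 + pixels * 4

def generation_cost(pixels):
    # Frame generation: statistical image processing over two finished
    # frames (plus motion vectors), run on dedicated optical-flow/tensor
    # hardware instead of the shader cores. (Made-up constant.)
    return pixels * 1

pixels = 1920 * 1080
print(render_cost(num_triangles=2_000_000, pixels=pixels))  # heavy, scene-dependent
print(generation_cost(pixels))                              # light, scene-independent
```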

[–] RubberElectrons@lemmy.world -1 points 1 day ago (1 children)
[–] ch00f@lemmy.world 3 points 1 day ago (1 children)

Which part? I mean even if it isn't generating the frames well, it's still doing the work. So that capability is there. What's the grift?

[–] RubberElectrons@lemmy.world 2 points 1 day ago* (last edited 1 day ago)

That it's reliable. The key point they're selling is that devs don't need to optimize their engines as much, of course obfuscated under a lot of other value-adds.

I'd go further than this and say part of our problem generally is that code optimization isn't a focus anymore. Apps that merely interface with web APIs are sometimes more than 90 MB. That's embarrassing.

That an AI can step in as a savior for poor coding practices is really just a bandage stuck over the root cause.

[–] Dequei@sopuli.xyz 3 points 1 day ago

I see, thank you