Why do people like this? It's worse than FSR3, yet gets shilled more.
>>101214695
FSR3 FG, that is.
Because you can run it in every game. FSR 3.1 fucking shits on it, but I can't use that on Elden Ring, New Vegas, etc. You get my point.
>>101214695
>Why do people like this? It's worse than FSR3, yet gets shilled more.
Like what? What am I looking at? "Duck Super Sampling"?
I'm not 1 cm away from my screen, bro, and the smoothness is worth it.
Couldn't make it work properly on my modded Skyrim, and I'm not even an idiot. Just paid the same price for an FSR mod from some random Patreon dev instead.
>>101214695
Because I can use it on my potato PC with integrated graphics.
these baby duck threads are getting ridiculous
What's the main usage here? Running a 60 FPS game at 144 FPS?
>>101214803
>FSR 3.1 fucking shits on it but i cant use it on Elden Ring New Vegas etc. etc. you get my point
Elden Ring supports the FSR3 DLSS FG mod.
>>101216260
Watching anime at 3x framerate to eliminate stuttering.
>>101216260
Yeah.
>>101216570
We already had that years ago. A properly configured player with frame interpolation/generation will be better than using LS.
>>101216260
Running a game locked at 60 FPS at 120 FPS.
>>101216260
I'm using a cracked old version for RPG Maker games at 1080p.
>>101214695
It's there as a universal application: it sacrifices the optimized performance of a native feature for universality. If your game doesn't need too much CPU core usage and you're GPU/VRAM bound, this helps.
>>101216942
It's like it's done late in the pipeline; it lags so bad.
>>101217164
A bad implementation can certainly cause lag. If too much overhead is introduced to upscale the frame, it kills the entire point of upscaling.
>>101214695
I use this to play some 240p games on a real CRT TV.
>>101217164
Pretty sure it IS done late in the pipeline. I believe this works by capturing finished frames (like OBS does for streaming), processing those, then displaying them in its own full-screen window (not the game's window). So yeah, it's so late in the pipeline it's not in the pipeline at all: it's using fully finished frames, and it's getting them when they'd normally already be on screen.
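For illustration, the "fully finished frames" approach above can be sketched in a few lines. This is a toy model only, not LS's actual code (all names here are made up): the crudest 2x generation is just blending each pair of captured frames, where real frame generation would use motion estimation instead.

```python
import numpy as np

def generate_intermediate(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    """Naive 2x frame generation: blend two captured, fully finished frames.
    Real FG uses motion vectors / optical flow, not a plain average."""
    mix = (prev_frame.astype(np.uint16) + next_frame.astype(np.uint16)) // 2
    return mix.astype(np.uint8)

def present_doubled(frames):
    """Presentation loop sketch: capture -> interpolate -> show at 2x rate.
    Emits a generated frame between every pair of real frames."""
    out = []
    prev = None
    for frame in frames:
        if prev is not None:
            out.append(generate_intermediate(prev, frame))  # generated frame
        out.append(frame)  # real, captured frame
        prev = frame
    return out
```

Since it only ever sees finished frames, the generated frame can't be shown until the *next* real frame has already been rendered, which is exactly where the extra latency comes from.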
>>101217288
>I use this to play some 240p games on a real CRT TV.
Why?
>>101217164
>>101217288
This. It's not as well implemented as DLSS or FSR3 FG, which work inside the render pipeline, hence why the latency is worse too.
>>101217277
>>101217164
It just needs to be sped up by doing it on the GPU with its thousands of specialized CUDA cores. A CPU-side implementation is probably the bottleneck here. Both NVIDIA and AMD do scaling on the GPU for a reason: they can split the screen and process it across thousands of cores in parallel, reducing the latency involved.
>>101217337
Long story short: modern graphics cards and DP/HDMI-to-VGA adapters don't support resolutions as low as 320x240, so for it to work you need a higher res. CRT displays only really care about the number of lines you feed them, they have no concept of pixels, so you can set your PC to something like 2560x240 or 1920x240 and that signal will work with both the CRT TV and a modern graphics card.
Now if you want a 320x240 game to run at 2560x240, for instance, you need to scale it: multiply the pixels, but only horizontally, while keeping the lines untouched. Since this is a weird-as-fuck mode, most things just don't support it, hence an external scaler. RetroArch supports CRT super-resolution output directly, but other things don't.
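For what it's worth, the horizontal-only multiply described above is nearly a one-liner if you treat the frame as an array. A minimal sketch (assumed function name, assuming a NumPy image of shape (lines, pixels, channels)):

```python
import numpy as np

def scale_horizontal_only(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor scale along the horizontal axis only.
    Line count stays untouched, which is all a CRT cares about."""
    return np.repeat(frame, factor, axis=1)

# e.g. a 320x240 game frame -> 2560x240 CRT super-resolution (factor 8):
# (240, 320, 3) becomes (240, 2560, 3)
```

Each pixel is duplicated `factor` times across the scanline, so the 240 lines reach the CRT exactly as rendered and no vertical resampling (and thus no shimmer) is introduced.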
>>101217277
>>101217400
But we're talking about FG, not upscaling?
>>101217501
Wouldn't it automatically scale if you use display aspect ratio correction in the drivers? Also, doesn't most stuff support stretch scaling (or are you talking about PC game ports, like indie games that render at 240p to look like old games)?
>>101217475
The same thing probably happens. GPU frame generation is done with thousands of cores; with these programs, the CPU is likely doing it single-threaded, or at best very weakly multi-threaded.
Furthermore, the GPU implementations from NVIDIA/AMD take advantage of the fact that they have direct access to the frame buffers, so they don't need to do memory copies between CPU and GPU. And even if you use GPU acceleration, you still need to control for those memory copies, since they cause latency.
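To put a rough number on those CPU-GPU copies: a back-of-the-envelope estimate, assuming a 1080p BGRA frame and ~16 GB/s of effective PCIe 3.0 x16 bandwidth (both figures are assumptions, not measurements):

```python
# Rough cost of copying one finished frame across the PCIe bus (assumed numbers)
width, height, bytes_per_pixel = 1920, 1080, 4   # 1080p BGRA
frame_bytes = width * height * bytes_per_pixel   # ~8.3 MB per frame
pcie_bandwidth = 16e9                            # assumed ~16 GB/s effective
copy_ms = frame_bytes / pcie_bandwidth * 1e3     # one-way copy time, ms

# A round trip (readback + re-upload) roughly doubles that. At 144 FPS the
# whole frame budget is only ~6.9 ms, so even ~1 ms of copying is significant.
print(round(copy_ms, 3))  # ~0.518 ms one way under these assumptions
```

Which is why staying entirely on the GPU, reading the frame buffer in place, avoids a meaningful chunk of added latency.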
>>101217501
No, not everything does, and the driver scaling is wonky and unreliable. Lots of games just run pillarboxed in the center, which on a TV results in a super narrow, incomprehensible image.
>indie games that render at 240p
Yeah, those are one reason. I've also used it for old DOS games, which actually render at 320x200, though for those I had to use some other scaler, since Lossless Scaling is too dumb to scale only horizontally and would scale the 200 lines to 240 too, which introduces shimmer.
i love frame generation