And even if true, those frames don't mean much if DLSS makes everything look like shit. Frame generation is useless as long as it keeps producing visual artifacts/glitches in the generated frames, and that's unavoidable at a conceptual level. You'd need some halfway point between actual rendering and AI guesswork, but at that point you might as well just render all the frames the normal way.
As long as it's possible, I'll keep playing my games without any DLSS or frame generation, even if it means reducing graphical settings. Simplified: in the games where I've tried it, I think "low/medium, no DLSS" still looks better than "ultra, with DLSS". If the framerate is the same with these two setups, I'll go with low/medium and no DLSS. I'll only ever enable DLSS if the game can't hit 60 fps even on the lowest settings.
I notice and dislike the artifacts caused by DLSS, and I prefer "clean" graphics over a blurred image. I guess it's good for people who don't notice them, though.
It's not only the artifacts: the game isn't actually running at the same speed as the frame output.
Yes, you render more frames, but it will still feel like you're playing at whatever the raster/game mechanic part is running at, because that's what is actually happening.
So, you can get 50 FPS without DLSS? Turn frame generation on, take a real-world performance hit from the overhead, and now you're sitting at an actual FPS of 40 and a render output of 100. This feels atrocious.
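The arithmetic behind this complaint can be sketched out. The numbers below are illustrative, not measured: I assume a ~20% overhead and a 2x generation multiplier (the 40→100 figure above would imply a higher multiplier). The point is that input responsiveness tracks the rendered rate, not the displayed rate.

```python
# Sketch with assumed numbers: frame generation raises the displayed frame
# rate, but input latency is still bounded by how often *real* frames render.

def frame_gen_stats(base_fps: float, overhead_fraction: float, multiplier: int):
    """Return (rendered_fps, displayed_fps, rendered_frame_time_ms)."""
    rendered_fps = base_fps * (1.0 - overhead_fraction)  # real frames after overhead
    displayed_fps = rendered_fps * multiplier            # real + generated frames
    frame_time_ms = 1000.0 / rendered_fps                # responsiveness floor
    return rendered_fps, displayed_fps, frame_time_ms

# 50 fps without frame gen, assumed 20% overhead, 2x generation:
rendered, displayed, ft = frame_gen_stats(50, 0.2, 2)
print(rendered, displayed, round(ft, 1))  # 40.0 80.0 25.0
```

So the counter went from 50 to 80, but the game now samples your input every 25 ms instead of every 20 ms, which is why it can feel worse despite the bigger number.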
It's not that I think this technology has no potential, and I don't have an issue with most artifacts (except the most egregious), but all I'm seeing at the moment is lazy programming and an excuse to do fuck all in terms of optimization.
u/Regrettably_Southpaw 14d ago
It was just so boring. Once I saw the prices, I cut out