It could be a conspiracy, or it could just be that developers implement those features when they get money or help from nVidia and AMD, and don't when they don't. Or maybe AMD's offering is easy to implement and nVidia's isn't. I vaguely recall that nVidia's DLSS needs tighter engine integration (motion vectors and such) than AMD's FSR does, but don't quote me on that.
In the end, I expect we'll eventually converge on a system that both vendors support, and this'll all just be a blip in history, like every other standard worth supporting.
Feels like G-Sync vs. FreeSync all over again. That fight has now largely converged on HDMI Variable Refresh Rate (VRR) and DisplayPort Adaptive-Sync, and these days it's a lot harder to find a VRR/Adaptive-Sync-capable display that won't do variable refresh at all with your graphics card.
Both Unreal Engine and Unity have temporal anti-aliasing upscaling (TAAU), though I believe it still lags behind DLSS in both visual quality and performance. That gap will narrow over time, and once it does I doubt we'll see many game developers spend extra effort implementing DLSS or FSR if TAAU gets close enough with no extra effort on their part.
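For context on the "no extra effort" point: in Unreal, turning on TAAU is roughly a console-variable toggle rather than an SDK integration. A minimal sketch, assuming UE4-era cvar names (r.TemporalAA.Upsampling and r.ScreenPercentage; exact names and defaults vary by engine version):

    ; DefaultEngine.ini -- minimal sketch; cvar names assumed from UE4-era documentation
    [SystemSettings]
    ; Upscale inside the TAA pass (TAAU) instead of anti-aliasing at native resolution
    r.TemporalAA.Upsampling=1
    ; Render at ~67% of the output resolution and let TAAU reconstruct the rest
    r.ScreenPercentage=67

Compare that to DLSS or FSR 2, which, as I understand it, need a plugin or SDK wired up to the engine's motion vectors.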
If you flip it, this sounds ridiculous: there are tons of games with FSR that are missing DLSS.