The moment GMan’s mouth fucking moved in Half Life when he talked. 🤯
Anymore = ever again
Any more = any further
They’re two different things.
They don’t make games that look like that anymore, even though we thought the graphics couldn’t get any more realistic back then.
You know everyone was scanning your comment hard to see if you made any grammar mistakes.
@ThirdWorldOrder @otp Right. Lol
Bruh, English has thousands of words with multiple meanings and the same spelling. You understood the meaning.
“Oh no, they’re trying to teach me about grammar! I must resist knowledge!”
Tbf these games were made with crtvs in mind, and crtvs blurred the edges, making things look smoother. They only look so blocky nowadays because newer tvs have better resolution, so you can clearly see all the blocky edges.
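The softening effect is easy to sketch: a hard pixel edge convolved with even a tiny blur kernel turns into a gradient. Here's a minimal numpy toy (not a real CRT model — actual CRTs blur via the electron beam spot and phosphor glow, this is just a 1-D convolution to show the idea):

```python
import numpy as np

# A toy 1-D scanline with a hard "blocky" edge: black pixels then white.
scanline = np.array([0, 0, 0, 0, 255, 255, 255, 255], dtype=float)

# Crude stand-in for CRT blur: a small 3-tap smoothing kernel.
kernel = np.array([0.25, 0.5, 0.25])
blurred = np.convolve(scanline, kernel, mode="same")

# The hard 0 -> 255 jump becomes a gradual ramp, i.e. a smoother edge.
print(scanline[2:6])  # [  0.   0. 255. 255.]
print(blurred[2:6])   # [  0.    63.75 191.25 255.  ]
```

On a high-res LCD there's no such analog blur, so every one of those hard jumps is shown exactly as stored — hence the blockiness.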
I have never in my life seen someone refer to CRT TVs as crtvs and it’s really fucking with my head lmao
It’s a habit I picked up from my dad lol
He always called them crtvs because he thought the “tube” part of cathode ray tube was unnecessary when using the acronym. You know it’s a tube because what else would a cathode ray be in?
I think it depends… Definitely with 2d games, but for instance, the heads and fists in Goldeneye (N64) always looked blocky
It is my opinion that we reached peak graphics 6 or 7 years ago, when the GTX 1080 was king. Why?
- Games from that era look gorgeous (e.g. Shadow of the Tomb Raider), yet were well optimized enough to run high/ultra at FHD on an RX 570.
- We didn’t need to rely on fakery like DLSS and frame generation to get playable frame rates. If anything, people used to supersample for the ultimate picture quality. Even upping the rendering scale to 1.25 made everything so crisp.
- MSAA and SMAA antialiasing look better, and somehow even TAA from that era doesn’t seem as blurry. Today, you might as well use FXAA.
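Supersampling itself is conceptually simple: render more samples than you have pixels, then average them down. A toy numpy sketch of 2x SSAA with made-up numbers:

```python
import numpy as np

# Pretend 4x4 frame rendered at 2x the native 2x2 resolution.
hi_res = np.arange(16.0).reshape(4, 4)

# Downsample 4x4 -> 2x2 by averaging non-overlapping 2x2 blocks
# (a simple box filter; real resolve filters can be fancier).
native = hi_res.reshape(2, 2, 2, 2).mean(axis=(1, 3))
print(native)
# [[ 2.5  4.5]
#  [10.5 12.5]]
```

Each output pixel blends four rendered samples, which is why supersampled frames look so crisp — and why it costs roughly 4x the fill rate.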
Graphics today seem ass-backward to me: render at 60…70% scale to have good framerates, FX are often rendered at even lower resolution, slap on overly blurry TAA to hide the jaggies, then use some upsample trickery to get to the native resolution. And it’s still blurry, so squirt some sharpening and noise on top to create an illusion of detail. And still runs like crap, so throw in frame interpolation to get the illusion of higher frame rate.
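That pipeline can be caricatured in a few lines of numpy — a 1-D toy with made-up numbers, where "upscaling" is plain linear interpolation and "sharpening" is an unsharp mask (real upscalers like DLSS use motion vectors and ML models, so this is only the shape of the idea):

```python
import numpy as np

native_res = 12
render_res = 8  # ~67% of native scale

# "Ground truth" scene: a smooth signal sampled at native resolution.
x_native = np.linspace(0.0, 1.0, native_res)
ground_truth = np.sin(2 * np.pi * x_native)

# Step 1: render at reduced resolution (fewer samples of the same scene).
x_render = np.linspace(0.0, 1.0, render_res)
low_res_frame = np.sin(2 * np.pi * x_render)

# Step 2: upscale back to native resolution via linear interpolation.
upscaled = np.interp(x_native, x_render, low_res_frame)

# Step 3: unsharp mask to fake back some detail:
# sharpened = upscaled + amount * (upscaled - blurred(upscaled)).
kernel = np.array([0.25, 0.5, 0.25])
blurred = np.convolve(upscaled, kernel, mode="same")
sharpened = upscaled + 0.5 * (upscaled - blurred)

# The result approximates the native frame but isn't identical to it.
error = np.max(np.abs(sharpened[1:-1] - ground_truth[1:-1]))
print(f"max interior error vs native render: {error:.3f}")
```

The residual error is the point: each stage only approximates what a native-resolution render would have produced.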
I think it’s high time we were able to run non-raytraced graphics at 4k native and raytraced graphics at 2.5k native on 500€ MSRP GPUs with no trickery involved.
GPUs are getting better, but demand from the crypto and ML/AI markets means they can just price every new card higher than the last, so prices have stopped dropping with each new generation.
- We didn’t need to rely on fakery like DLSS and frame generation to get playable frame rates.
If you truly believe what you wrote, then you should never look into the details of how a game world is rendered. It’s fakery stacked upon fakery that somehow looks great. If anything, the current move to ray tracing with upscaling is less fakery than what came before.
There’s a saying in computer graphics: if it looks right, it is right. Meaning you shouldn’t worry that a technique makes a mockery of how light actually works, as long as the viewer won’t notice.
Sure, all of graphics is about creating an illusion.
But there’s a stark difference between optimizations like culling, occlusion planes, LODs, and half-res rendering of costly FX (like AO), and using a crutch like lowering the rendering resolution of the whole frame to try and make up for bad optimization or crap hardware. DLSS has its place for 150…200€ entry-level GPUs trying to drive a 2.5k monitor, not 700€ “midrange” cards.
But there’s a stark difference between optimizations like culling, occlusion planes, LODs, and half-res rendering of costly FX (like AO),
and using a crutch like lowering the rendering resolution of the whole frame to try and make up for bad optimization or crap hardware.
There is no stark difference if you describe the techniques objectively instead of twisting them into what you feel they’re like.
There are so many steps in the render pipeline where native resolution isn’t used, yet I don’t hear the crowd complaining about shadow map size or how reflections are half res. Upscaling is just another tool that allows us to create better-looking frames at playable refresh rates. Compare Alan Wake or Avatar with DLSS against any other game without DLSS and they will still come out on top.
DLSS has its place for 150…200€ entry-level GPUs trying to drive a 2.5k monitor, not 700€ “midrange” cards.
Just because you’re unhappy with Nvidia’s pricing strategy doesn’t mean you should slander new render techniques. You’re mixing two different topics.
I used to have a subscription to Game Informer magazine. I very specifically remember the multi-page preview for the upcoming game, Oblivion. The pictures they had in there, I swear to God, were actually pictures of trees and grass. The fidelity was unparalleled and it was the peak of what games could do. Idk why that article sticks out so much, but it felt like the top of the mountain.
I had Quake running with software 3D, got a 3DFX board and patched Quake to run with hardware 3D and the results just blew my mind…
I remember upgrading to a voodoo 3dfx card around the time transparent water was possible in Quake. The graphics blew me away and the ability to see players in the water gave a ridiculous advantage.
Hah, I get that, but for me it was Half-Life 1 and I thought the graphics were amazing. Rainbow Six: Rogue Spear was my first PC game and I thought that was the pinnacle of graphics… fuck I’m old.
If you want a real nostalgia kick go here:
https://youtu.be/ZiSTywA6fHY?si=6mGUbtMeqV4OG_vC
I remember the game disc doubled as a soundtrack if you put it in a CD player.
For me it was reading in PlayStation Magazine that there were melting ice cubes in the then-upcoming Metal Gear Solid 2. I’m not even sure the PS2 had been released yet at the time, so I was just awestruck thinking wow, it’s getting so powerful and detailed that even ice cubes in a sink are accounted for.
I can relate, but by the time Oblivion came out I was already starting to get jaded about graphical fidelity. What I can tell you is that I ogled over a similar preview for Morrowind, and actually built my first PC specifically targeting the recommended specs to run it in all its glorious glory!
Tale as old as time I suppose