I think most people who were gaming held onto their CRTs as long as possible. The main reason was that the first generation of LCD panels took an analogue RGB input and had to map it onto the digital panel. They were generally ONLY 60hz, and you often had to re-adjust their settings when you changed resolution. Even then, the picture was generally worse than a comparable, good-quality CRT.
People upgraded mainly because of the reduced space usage and the better aesthetics. Where I worked, we only had an LCD panel on the reception desk, for example. Everyone else kept using CRTs for some years.
CRTs on the other hand often had much better refresh rates available, especially at lower resolutions. This is why it was very common for competitive FPS players to use resolutions like 800x600 when their monitor supported up to 1280x960 or similar. The 800x600 resolution would often allow 120 or 150hz refresh.
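To put rough numbers on that: a CRT’s maximum refresh at a given resolution is basically its maximum horizontal scan rate divided by the number of scan lines per frame. Here’s a minimal Python sketch of that arithmetic; the 85 kHz limit and the ~5% vertical blanking overhead are illustrative assumptions, not the specs of any particular monitor.

    # Why lower resolutions allowed higher refresh rates on a CRT.
    # The limiting factor modelled here is the tube's maximum horizontal
    # scan frequency; 85 kHz and the 5% blanking overhead are assumptions.
    MAX_HORIZONTAL_HZ = 85_000   # hypothetical horizontal scan limit
    VBLANK_OVERHEAD = 1.05       # ~5% extra scan lines for vertical blanking

    def max_refresh_hz(visible_lines: int) -> float:
        total_lines = visible_lines * VBLANK_OVERHEAD
        return MAX_HORIZONTAL_HZ / total_lines

    for res, lines in [("800x600", 600), ("1024x768", 768), ("1280x960", 960)]:
        print(f"{res}: ~{max_refresh_hz(lines):.0f} Hz")
    # Prints roughly: 800x600 ~135 Hz, 1024x768 ~105 Hz, 1280x960 ~84 Hz

Dropping the resolution frees up scan lines, and the spare horizontal rate goes straight into refresh.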
When LCD screens with a fully digital interface became common, even though they were pretty much all 60hz locked, they started to offer higher resolutions and in general comparable or better picture quality in a smaller form factor. So people moved over to the LCD screens.
Fast-forward to today, and now we have LCD (LED/OLED/Whatever) screens that are capable of 120/144/240/360/Whatever refresh rates. And all the age-old discussions about our eyes/brain not being able to use more than x refresh rate have resurfaced.
It’s all just a little bit of history repeating.
I had a 20-odd inch CRT with the flat tube. Best CRT I ever had, last one I had before going to LCD. Still miss that thing, the picture was great! Weighed a ton, though.
I thought that dude was a woman with her hair tied up
It’s ok, if anyone wants them back, the Smash Brothers Melee community has them all in the back of their cars
I like how no one mentions that CRT pixels bleed into each other.
Are you sure it was CRT technology? Bear in mind that colour CRTs had to focus the beam accurately enough that it only hit the specific phosphor “pixel” for the colour being lit at that time. What there was, was blur from bad focus settings, age and phosphor persistence (which, as ghosting, is still a thing on LCDs to an extent).
What DID cause blur was merging the luminance, the colour and the synchronisation into a single composite signal. All the mainstream systems (PAL, SECAM and NTSC) would cause a blurring effect. Games on 80s/90s consoles generally used this to their advantage, and you can see the dithering patterns clearly on emulators of systems from that period. Very specifically, the colour signal sharing spectrum with the luminance signal softens the image, which reads as blur. Most consoles of the time only output an RF signal for a TV or, if you were lucky, a composite output.
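As a rough illustration of how that dithering trick worked, here’s a minimal Python sketch. The shared luma/chroma bandwidth is approximated with a plain horizontal moving average, which is much cruder than real NTSC/PAL subcarrier encoding, so treat it as a toy model only.

    def composite_soften(scanline, taps=3):
        # Crude horizontal low-pass, standing in for composite bandwidth limits.
        out = []
        for i in range(len(scanline)):
            window = scanline[max(0, i - taps // 2): i + taps // 2 + 1]
            out.append(sum(window) / len(window))
        return out

    # A one-pixel checkerboard dither: alternating black (0) and white (255).
    dithered = [0, 255] * 8
    print(composite_soften(dithered))
    # The alternating pattern gets pulled toward a mid grey, which is the blend
    # the artists were counting on; a sharp emulator shows the raw checkerboard.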
Good computer monitors (not TVs) of the time were extremely crisp when fed a suitable RGB signal.
And it worked: AA wasn’t as important on that “fuzzier” screen, back when graphics weren’t as good as they are today.
Sure, and in fact some developers used the fuzziness to their advantage, which can make certain games look weird when you display them on anything modern. But my point was more that some people in here are acting like every part of the CRT experience was better than flatscreens.