It’s conservationist, reducing hardware requirements to lengthen the lifetime of old hardware.
Less on the general software side and more on the gaming side: why target the iGPU then? Although it's common, even a dedicated card nearly a decade old would be an instant uplift, gaming-performance-wise. The ones who typically run into performance problems are mostly laptop users, and laptops are the segment most wasteful with old hardware: unless you own something like a Framework, the user ends up replacing the entire device.
I, for one, am always behind lengthening the lifetime of old hardware (hell, I just replaced a decade-old laptop recently), but there's a limit to the expectations you can have. E.g., don't expect to be catered to, iGPU-wise, if you willingly picked a pre-Tiger Lake iGPU. The user intentionally picked the worse graphics hardware, and catering the market to bad decisions is a bad move.
I, for one, hate the way PC gamer culture has normalized hardware obsolescence. Your hobby is just for fun; you don't really need to gobble up so much power and so many rare-earth minerals and ever-thinner wafers just to throw away parts every six months.
I have plenty of fun playing ASCII roguelikes, and I do not respect high-performance gaming. It's a conservationist nightmare.
Who's throwing away stuff every six months? Hardware cycles aren't even remotely that short; hell, Moore's law was never that short in the entire existence of said law. And it's not like I don't do my fair share of preventing hardware waste (my literal job is refurbishing and reselling computer hardware; I'm legitimately doing several times more than the average person to keep older hardware going). But it's not my job to dictate what is fun and what's not. What's fun for you isn't exactly everyone else's definition of fun.