But … where is the innovation (and also Alt text?)
Image description.
The image is a screenshot of a tumblr post by user elbiotipo.
My solution for bloatware is this: by law you should hire in every programming team someone who is Like, A Guy who has a crappy laptop with 4GB and an integrated graphics card, no scratch that, 2 GB of RAM, and a rural internet connection. And every time someone in your team proposes to add shit like NPCs with visible pores or ray tracing or all the bloatware that Windows, Adobe, etc. are doing now, they have to come back and try your project in the Guy’s laptop and answer to him. He is allowed to insult you and humiliate you if it doesn’t work in his laptop, and you should by law apologize and optimize it for him. If you try to put any kind of DRM or permanent internet connection, he is legally allowed to shoot you.
With about 5 or 10 years of that, we will fix the world.
Innovation is orthogonal to code size. There is nothing the software on most modern computers does that couldn’t be done on 10-year-old machines. It’s just a question of whether the team creating your software is plugging together gigantic pieces of bloatware or actually developing a solution to a real problem.
Planned obsolescence is one of the major engines that keep our current system of oligarchic hypercapitalism alive. Won’t anybody think of the poor oligarchs?!?
Resources are just way cheaper than developers.
It’s a lot cheaper to have double the RAM than it is to pay someone to optimize your code.
And if you’re working with code that needs that level of resource optimization, you’ll invariably end up with low-level code libraries that are hard to maintain.
… But fuck the always-on internet connection and DRM for sure.
If you consider only the RAM on the developers’ PCs, maybe. If you count in the thousands of customer PCs, then optimizing the code outperforms hardware upgrades pretty fast. And if millions of people have to buy new hardware because of a new Windows feature, that’s pretty disastrous from a sustainability point of view.
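Rough back-of-envelope sketch of why (every figure below is an invented assumption, just to show the shape of the trade-off): the optimization cost is paid once, while the hardware cost is paid by every affected customer, so even a modest install base tips the balance quickly.

```java
// Back-of-envelope comparison: one-time optimization cost vs. the hardware
// cost that gets externalized to customers. Every number here is a made-up
// assumption for illustration, not real data.
public class OptimizationBreakEven {
    public static void main(String[] args) {
        double optimizationCost = 30_000;   // assumed cost of developer time spent optimizing
        double upgradePerCustomer = 40;     // assumed price of a RAM upgrade per affected customer
        long affectedCustomers = 100_000;   // assumed install base that would otherwise need the upgrade

        double externalizedCost = upgradePerCustomer * affectedCustomers;
        double breakEvenCustomers = optimizationCost / upgradePerCustomer;

        System.out.printf("Externalized hardware cost: $%,.0f%n", externalizedCost);              // $4,000,000
        System.out.printf("Optimizing wins past ~%.0f affected customers%n", breakEvenCustomers); // ~750
    }
}
```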
Last time I checked, your personal computer wasn’t a company cost.
Until it is, nothing changes. And to be totally frank, the last thing I want is to be on a corporate machine at home.
When I was last looking for a fully remote job, a lot of companies offered a “technology allowance” every few years: they give you money to buy a computer or laptop. You could buy whatever you wanted, but you had that fixed allowance. The computer belonged to you, and you connected to their virtual desktops for work.
Honestly, I see more companies going in this direction. My work laptop has an i7 and 16GB of RAM. All I do is use Chrome.
Or maybe you could actually read the comment you are replying to instead of being so confrontational? They are literally making the same point you are making, except somehow you sound dismissive, like we just need to take it.
In case you missed it, they were literally saying that because the real cost of running software (like the AI Recall bullshit) is externalized to consumers, companies don’t give a shit about fixing this. Literally the same thing you’re saying. And it means that we all, as a society, are just wasting a fuck ton of resources. But capitalism is so efficient, hahaha.
But come on man, you really think that the only option is for us to run corporate machines in our homes? I don’t know if I should feel sorry about your lack of imagination, or if you are trying to strawman us here. I’m going to assume lack of imagination, don’t assume malice and all that.
That’s the kind of thing simple legislation could fix. For example, let’s say I buy a cellphone or computer, then buy an app or program for that device, and the device meets the required specifications to run the software. The company that sold me that software should be obligated by law to give me a version that runs on my machine forever. This is not a lot to ask for; it’s literally how software worked before the internet.
But now, under the cover of security and convenience, all of that is out the window. Each new Windows/macOS/iOS/Android/Adobe/fucking-anything update demands more and more hardware with little to no meaningful new functionality. So we need to keep upgrading and upgrading, and spending and spending.
But this is not a given; we can do better with very little sacrifice.
You can also build a chair out of shitty plywood instead of quality cut wood, and it falls apart when someone who weighs a bit more sits on it. I mean, fine if you want to make a bad product, but then you’re making a bad product.
Resource optimization has nothing to do with product quality. Really good experiences can come with shitty resource consumption, and really bad experiences can be blisteringly fast and heavily optimized.
The reason programmers work in increasingly abstract languages is to do more with less effort at the cost of less efficient resource utilization.
RollerCoaster Tycoon was ASM. Slay the Spire was Java. They’re both excellent games.
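A toy sketch of that trade-off (my own example, not taken from either game): both methods compute the same result; the stream version is shorter and reads closer to the intent, while the plain loop does the same work with less machinery behind it. Multiply that difference across an entire codebase and you get the abstraction-versus-efficiency tension.

```java
import java.util.stream.IntStream;

public class AbstractionTradeoff {
    // Higher-level style: concise and declarative, but goes through
    // stream/lambda machinery, which typically adds some runtime overhead.
    static long sumOfSquaresStream(int[] values) {
        return IntStream.of(values).mapToLong(v -> (long) v * v).sum();
    }

    // Lower-level style: more code to read and maintain,
    // but just a tight loop with no extra allocation.
    static long sumOfSquaresLoop(int[] values) {
        long total = 0;
        for (int v : values) {
            total += (long) v * v;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] data = {1, 2, 3, 4, 5};
        System.out.println(sumOfSquaresStream(data)); // 55
        System.out.println(sumOfSquaresLoop(data));   // 55
    }
}
```

Neither version is wrong; the point is just that the cheaper-to-write one is rarely the cheaper-to-run one.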
“It’s a lot cheaper to have double the RAM”
Yeah, a lot cheaper to force someone else to buy double the RAM. No thanks.
Companies own the code you write.
It’s not your code if you’re working for a corp; it’s theirs.
I 100% agree. But, where Linux?
This is like the definition of a “conservative”. Progress shouldn’t happen because they’re not ready for it. They are comfortable with what they use and are upset that other people are moving ahead with new things. New things shouldn’t be allowed.
Most games have the ability to downscale so that people like this can still play. We don’t stop all progress just because some people aren’t comfortable with it. You learn to adjust or catch up.
It’s not really about comfort when you buy software and it doesn’t work unless you also buy an $800 hardware upgrade. Especially when it worked fine on the previous version and the only difference is the addition of extraneous features.
I think the examples given are just poorly chosen. When it comes to regular applications and DRM, then yes, that’s ridiculous.
On the other hand, when it comes to gaming, then yes, give me all the raytracing and visible pores on NPCs. Most modern games also scale down well enough that it’s not a problem to have those features.
The topic is bloatware, not games. Very different. When it comes to gaming, the hardware costs are a given (for the sake of innovation, as you put it); but when it comes to something fundamental to your computer—think of the window manager or even the operating system itself—bloat is like poison in the hardware’s veins. It is not innovation. It is simply a waste of precious resources.
“The topic is bloatware, not games.”
The original post includes two gaming examples, so it’s actually about both, which is a bit unfortunate, because as you’ve said, they’re two very different things.
It’s the opposite. Limitations foster creativity. Those old computers and game consoles could do amazing things when people wanted to do something. Now you don’t have to think about what you’re doing; just expect the user to have high-end equipment and a super-high-speed internet connection. It’s the equivalent of saying you need a trophy truck to drive on the road you just built because it’s too shitty for a regular car.
Somebody didn’t live through the “Morrowind on Xbox” era, where “creativity” meant intentionally freezing the loading screen and rebooting your system in order to save a few KB of RAM so the cell would load.
There was also no automatic corpse cleanup, so the game would eventually become unplayable: entities died outside of your playable area, you couldn’t remove them from the game, and they created huge bloat in your save file.
Not all creativity is good creativity.
“Limitations foster creativity.”
100% agree. But there’s no reason to limit innovation because some people can’t take advantage of it. Likewise, we shouldn’t force people to constantly upgrade just to have access to something; still, there has to be a limit to this. 20 years of tech changes is huge. You could get 2 GB of RAM in most home computers back in the early-to-mid 2000s… that’s two decades ago.
I’m still gaming on my desktop that I built 10 years ago quite comfortably.
It’s conservationist, reducing hardware requirements to lengthen the lifetime of old hardware.
Less on general software and more on the gaming side: why target the iGPU then? Common as iGPUs are, even a dedicated GPU that’s nearly a decade old would be an instant uplift in gaming performance. The ones who typically run into performance problems are mostly laptop users, and laptops are the most wasteful segment of the industry when it comes to old hardware: unless you own something like a Framework, you end up replacing the entire device.
I’m, for one, always behind lengthening the lifetime of old hardware (hell, I just replaced a decade-old laptop recently), but there’s a limit to the expectations you can have. E.g., don’t expect to be catered to, iGPU-wise, if you willingly picked a pre-Tiger Lake iGPU. The user intentionally picked the worse graphics hardware, and catering the market to bad decisions is a bad move.
I, for one, hate the way PC gamer culture has normalized hardware obsolescence. Your hobby is just for fun; you don’t really need to gobble up so much power, rare-earth minerals, and ever-thinner wafers just to throw away parts every six months.
I have plenty of fun playing ASCII roguelikes, and I do not respect high-performance gaming. It’s a conservationist nightmare.