How in the world does setting a bunch of subs to private crash the website?
High-scale software is complex; sometimes there are edge cases where weird, unexpected stuff happens. This isn't a situation they would normally run into.
It absolutely is something they would normally run into. I work on maintaining a massive application; think 60+ teams of 6, each extremely specialized, with minimal overlap between them. Almost 75% of my job is predicting issues and avoiding them, and peer testing draws on this a ton as well. They just continue to plainly show that they don't care. Time and time again, year after year, they have the exact same issues and do fuck all about it.
Why would they normally run into 6000+ subs going private? I’m sure they tested that their code can generally handle some (usually smaller) subs going private, but the number and size of the subs going dark isn’t a normal scenario and I doubt anyone would have assumed such a successful and coordinated protest involving some of the biggest subs would even be possible a few months ago.
Someone on Tildes posted that they used to work for Reddit, and that the front page code that pulls in your subscribed subreddits and the ones it thinks you might like is spaghetti and very brittle.
I mean, it’s worked until now lol. Probably should have fixed it, but I can understand why the higher ups wouldn’t want to.
Reddit has had extremely spotty reliability forever. It got better in recent years, but it still went down every few weeks, or would just randomly say "you broke reddit!". Circa 2015, every evening it would just randomly return 5xx errors a good chunk of the time because it was always overloaded.
Backend reliability mustn’t be very high up their priority list. Well, neither is UX (old OR new reddit), and let’s not pretend that they’ve been masterminds when it comes to ad placement either, so the real question is what do the higher ups want, and why can’t they achieve it?
honestly I figured it’d be the result of all those people running deletion scripts on their accounts
This is probably it. Also, ArchiveTeam is archiving Reddit as a high-priority target, so there are lots of bots scraping it.
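(For anyone wondering what those deletion scripts actually look like: most are just a loop over your own comment listing through the official API. Here's a rough sketch using PRAW, the Python Reddit API Wrapper; every credential below is a placeholder for your own script app, and this is illustrative, not any specific script people ran.)

```python
import praw

# Placeholder credentials; you'd register your own "script" app
# at reddit.com/prefs/apps to get real values.
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    username="YOUR_USERNAME",
    password="YOUR_PASSWORD",
    user_agent="comment-wipe script by u/YOUR_USERNAME",
)

# Walk your own comment history (newest first) and delete each one.
# Reddit listings only go back ~1000 items, which is why these
# scripts tend to get re-run in multiple passes.
for comment in reddit.user.me().comments.new(limit=None):
    comment.delete()
```

Multiply a loop like that by thousands of accounts and it's a lot of write traffic hitting the servers at once.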
God bless archive.org. We’d be so screwed without their efforts.
that's kinda hilarious
Just one thing: it has to be indefinite, not 48h, otherwise nothing will change. A year has 365 days; 48h is a weekend. If subreddits come back after 48h, it's as if nothing happened.
I think stating it as 48 hours was smart, because if subreddits start saying it's indefinite, the admins have time to start replacing mods and shutting the protest down. Whereas here, 48 hours will pass and they lose a lot of money in just 2 days. And if nothing changes, you'll likely see decreased quality and/or continued protest.
I think it left room at the table for reddit to cooperate. It’s a common bargaining thing.
I saw a lot of people saying they used bots to rewrite their entire post history to things like “fuck /u/spez”
I’m thinking that amount of comment editing might be quite the spike on the servers.
I did that, but not with "fuck u/spez", because that might get the comment removed altogether, making the edit meaningless. Instead I edited them to say the comment's been deleted in protest against the API changes.
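(Same idea as the deletion scripts above, just calling edit instead of delete so the protest message stays visible rather than leaving a removed stub. A minimal sketch with PRAW, again with placeholder credentials:)

```python
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # placeholder
    client_secret="YOUR_CLIENT_SECRET",  # placeholder
    username="YOUR_USERNAME",            # placeholder
    password="YOUR_PASSWORD",            # placeholder
    user_agent="protest-edit script by u/YOUR_USERNAME",
)

REPLACEMENT = "This comment has been deleted in protest against the API changes."

# Overwrite each comment body instead of deleting it, so the
# protest text remains readable on the site. PRAW handles
# Reddit's rate limiting between these edit calls automatically.
for comment in reddit.user.me().comments.new(limit=None):
    comment.edit(REPLACEMENT)
```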
How much spaghetti is your code if most of the communities switching to private fucks up your website?