Hey lemmings!

I wanted to share a quick update about our recent performance issues and how I have addressed them.

The last 24h have been a bit rough for lemm.ee.

Last night, I spent some time debugging federation issues with lemmy.world. We managed to significantly improve the situation - lemmy.world content is now reaching lemm.ee with a very high success rate - but this has had the effect of increasing incoming federation traffic on our servers significantly.

Additionally, we have been seeing steadily increasing normal user traffic over the past week, which is awesome from a community standpoint, but of course means that our servers have to do more work to keep up with all the new people.

To top things off, a badly configured instance appeared in the network today and effectively launched a DoS attack against lemm.ee for several hours. It was most likely unintentional, but the end result was still a sudden increase in our server load.

All these factors combined resulted in a really bad experience for most lemm.ee users today. Page load times were consistently spiking to 10 seconds or more throughout the day.

In fact, a lot of page loads just timed out with errors.

Fortunately, it seems I have managed to clear up the problems!

I have put a bunch of mitigations in place, and after monitoring the situation for the past hour, it seems that our performance issues have been resolved for now. So hopefully, you can enjoy browsing lemm.ee again without it feeling like torture!

Here are specific steps I took:

  • I have doubled the hardware resources for our backend servers and database.
  • I purchased a Cloudflare pro subscription for lemm.ee for 1 year. This took out a considerable chunk of my budget for lemm.ee, but in return it will allow me to analyze and optimize our cache usage to a far greater extent. I am already seeing vastly reduced load times for cacheable content (try opening https://lemm.ee a few times in a row as a logged out user - it should be blazing fast now!)
  • I have configured a rate limiter which will prevent future DoS from the specific method that was used against us today.
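I won't go into the details of the exact rate limiter configuration here, but for those curious how this kind of mitigation generally works: most request rate limiters are some variant of a token bucket, which allows short bursts while capping the sustained request rate. Below is a minimal Python sketch of the idea (this is an illustration of the general technique, not lemm.ee's actual implementation; the `rate` and `capacity` numbers are made up):

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: allows `rate` requests per
    second on average, with bursts of up to `capacity` requests."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity    # start with a full bucket
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A burst of 15 back-to-back requests against a bucket of capacity 10:
limiter = TokenBucket(rate=5, capacity=10)
results = [limiter.allow() for _ in range(15)]
# The first 10 requests (the burst capacity) pass; the rest are rejected
# until the bucket refills.
```

In practice this sits in front of the application (e.g. in the reverse proxy), so abusive traffic is dropped before it ever reaches the backend or database.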

Of course, all of the above is costly. Luckily, lemm.ee users have been very generous with donations in the month of June, and in fact a significant number of donors have opted for monthly recurring contributions. This gives me the confidence to increase our spending for now, and I currently expect NOT to increase my personal planned contribution of 150€/month, as the increased costs so far are entirely covered by donations!

Let me take this opportunity to thank the sponsors who made these upgrades possible! All lemm.ee users are now enjoying better performance thanks to you. I could not have done it without you awesome people.

On a final note, I just want to say that I hope a lot of these issues can be solved by optimizations in Lemmy software itself in the future. I have been personally contributing several optimizations to the Lemmy codebase, and I know many others are focused on optimizations as well. Just throwing extra resources at the problem will probably not be a sustainable solution for very long 😅. But I am optimistic that we are moving in the right direction with the software changes, and we’ll be enjoying reduced resource needs before long.

That’s all I wanted to share today, I wish you all a great weekend!

9 points

You broke the https://lemm.ee homepage, it returns a json.

38 points

It does load that JSON very quickly!

16 points

Ah, yes, the optimist

9 points

And really, who doesn’t love json?

I mean, protocol buffers might be more efficient, but JSON is nice and readable. Much nicer than XML, for example. And significantly more readable than protobuf!

12 points

It should be fixed now!

7 points

It is, though I had to clear my cache :)

44 points

I just fucking love the transparency of the admins of lemmy.world and lemm.ee. Cheers guys!

24 points

You’re welcome!

9 points

Thank you. I did notice when it got significantly faster.

10 points

Wanted to share something from my experience running a Pleroma instance: I was having an issue where PostgreSQL was consuming more and more of my CPU. It looked like I was going to have to buy a seriously upgraded server; my load average was constantly around 3-4.

I ran pg_repack during a lower-traffic hour (the site kept running during the repack, just at reduced performance), and my load average dropped by about 90%, to well under 1. Now I have it set to repack weekly (YMMV, it just seemed like a good frequency to me).
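For anyone who wants to try this, the command-line usage looks roughly like the following. Note that the database name `lemmy` is just a placeholder, and the pg_repack extension must first be created in the target database:

```shell
# One-time setup inside the target database:
#   CREATE EXTENSION pg_repack;

# Repack all tables in the database online, without taking long
# exclusive locks (--no-order does a VACUUM FULL-style rewrite
# instead of re-clustering by index):
pg_repack --dbname=lemmy --no-order

# Example weekly cron entry (Sunday 04:00, a low-traffic hour):
# 0 4 * * 0 pg_repack --dbname=lemmy --no-order
```

Run it during off-peak hours anyway, since the rewrite still competes with normal traffic for I/O.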

Haven’t done it to my Lemmy server yet, but that’s because of all my instances this one is the newest.

8 points

Yeah the slowdown was a bit rough, been browsing off and on all day today. Thanks for fixing that. Seems to be working a lot better now. That’s a bummer you had to increase expenses though.


Meta (lemm.ee)

!meta@lemm.ee


This is a community for discussion about this particular Lemmy instance.

News and updates about lemm.ee will be posted here, so if that’s something that interests you, make sure to subscribe!


Rules:

  • Support requests belong in !support
  • Only posts about topics directly related to lemm.ee are allowed
  • If you don’t have anything constructive to add, then do not post/comment here. Low-effort memes, trolling, etc. are not allowed.
  • If you are from another instance, you may participate in discussions, but remain respectful. Realize that your comments will inevitably be associated with your instance by many lemm.ee users.

If you’re a Discord user, you can also join our Discord server: https://discord.gg/XM9nZwUn9K

Discord is only a back-up channel, !meta@lemm.ee will always be the main place for lemm.ee communications.


If you need help with anything, please post in !support instead.

Community stats

  • 821

    Monthly active users

  • 261

    Posts

  • 7.4K

    Comments
