Is it not incredible that the same people who talk about the West descending into fascism also believe the West holds moral superiority over the rest of the world and has the right to impose its values on the global majority?
I think about this a lot. Westerners love to talk about how much they hate one another and how miserable everything is, yet will still turn around and preach to marginalized and oppressed people, domestically and abroad, that this is the best system and that we are justified in using war to “spread” it across the globe.