when it comes to africa for example i was taught that they’re poor cause their land isn’t harvestable or some bullshit like that, and then i find out as an adult it’s actually cause western countries fucked the shit out of them and huh that makes a lot more sense
I was never taught WHY anyone was poor. It seemed like most education just focused on the fact that there just WERE poor people and it was something you always had to accept… Hmm, wonder why that is…
In high school and college I was basically taught that colonialism did most of it. However, it wasn’t until I took a class on globalization in college that I learned about the World Bank, the IMF, and all that absolutely fucking insidious shit. I had already considered myself a leftist at that point, but I knew that if I hadn’t, that class would’ve radicalized me not even halfway through the course, because that’s where I finally really began to understand neo-imperialism/neo-colonialism.
Me as a child in the shit country I live in:
- “Dad, why are we poor and not like the cool countries in Europe”
- “Eeerhm… eeeh… dark-skinned people don’t work and they vote Peronist”
Then I learned history and shit.
i kinda put together that a lot of african poverty is because of literal occupation, but i thought it was only in the past. I only recently learned that some west african countries are still paying “debts” to france for “damages” incurred during the french empire. it’s super bad
tbf it’s an easy lie to sell that the land isn’t arable when the top third of the continent is the Sahara desert