I read this quote today, and it resonated:

"The unborn” are a convenient group of people to advocate for. They never make demands of you; they are morally uncomplicated, unlike the incarcerated, addicted, or the chronically poor; they don’t resent your condescension or complain that you are not politically correct; unlike widows, they don’t ask you to question patriarchy; unlike orphans, they don’t need money, education, or childcare; unlike aliens, they don’t bring all that racial, cultural, and religious baggage that you dislike; they allow you to feel good about yourself without any work at creating or maintaining relationships; and when they are born, you can forget about them, because they cease to be unborn. You can love the unborn and advocate for them without substantially challenging your own wealth, power, or privilege, without re-imagining social structures, apologizing, or making reparations to anyone. They are, in short, the perfect people to love if you want to claim you love Jesus, but actually dislike people who breathe. Prisoners? Immigrants? The sick? The poor? Widows? Orphans? All the groups that are specifically mentioned in the Bible? They all get thrown under the bus for the unborn. - David Barbary, Methodist pastor

It certainly rings true for white American evangelicals, but it quickly occurred to me that it applies pretty well to longtermists too. Centering the well-being of far-future simulated super-humans repulses me, but it seems very compelling to the majority of the EA cult.

2 points

The linked stats are already way out of date.

Do you have a source for this “majority” claim? I tried searching for more up-to-date data, but this less comprehensive 2020 data is even more skewed towards Global development (62%) and animal welfare (27.3%), with 18.2% for long-term and AI charities (which is not equivalent to simulated humans, because it also includes climate change, near-term AI problems, pandemics, etc.). The utility of existential risk reduction is basically always based on population growth/future generations (a.k.a. humans) and not simulations. “Digital person” has only 25 posts on the EA Forum (by comparison, global health and development has 2,097 posts). It seems unlikely to me that this is a majority belief.

9 points

Short answer: “majority” is hyperbolic, sure. But it is an elite conviction espoused by leading lights like Nick Beckstead. You say the math is “basically always” based on flesh-and-blood humans, but when the exception is the ur-texts of the philosophy, counting statistics may be insufficient. You can’t really get more inner sanctum than Beckstead.

Hell, even 80,000 Hours (an org meant to be a legible and appealing gateway to EA) has openly grappled with whether global health should be deprioritized in favor of so-called suffering-risks, exemplified by that episode of Black Mirror where Don Draper indefinitely tortures a digital clone of a woman into subjugation. I can’t find the original post, formerly linked from their home page, but they do still link to this talk presenting that original scenario as a grave issue demanding present-day attention.

7 points

Calling it a majority might be unwarranted. EAs have bought a lot of mosquito nets, and most of those donations were probably not made with the thinking “can’t lift-and-shift this old brain of mine into the cloud if everyone dies of malaria”.

That said, the data presented on that page is incredibly noisy, with a very small sample size among the individual respondents who specified the cause they were donating to, and numbers easy to skew with a few big donations. There’s also not much in there about the specific charities being donated to. For all I can tell, they could just be spinning some AI bullshit as anything from public health to criminal justice reform. Speaking of which,

AI charities (which is not equivalent to simulated humans, because it also includes climate change, near-term AI problems, pandemics, etc.)

AI is to climate change as indoor smoking is to fire safety; “near-term AI problems” is an incredibly vague and broad category; and I would need someone to explain to me why they believe AI has anything to do with pandemics. Any answer I can think of would reflect poorly on the one holding such a belief.

2 points

the data presented on that page is incredibly noisy

Yes, that’s why I said it’s “less comprehensive” and why I first gave the better 2019 source, which also points in the same direction. If there is a better source, or really any source, for the “majority” claim, I would be interested in seeing it.

Speaking of which,

AI charities (which is not equivalent to simulated humans, because it also includes climate change, near-term AI problems, pandemics, etc.)

AI is to climate change as indoor smoking is to fire safety; “near-term AI problems” is an incredibly vague and broad category; and I would need someone to explain to me why they believe AI has anything to do with pandemics. Any answer I can think of would reflect poorly on the one holding such a belief.

You misread; it’s 18.2% for long-term and AI charities [emphasis added].

6 points

18.2% is not a majority, but it’s 18.2 percentage points higher than it would be in a movement that didn’t have a serious fucking problem.

