I did fake Bayesian math with some plausible numbers, and found that if I started out believing there was a 20% per decade chance of a lab leak pandemic, then if COVID was proven to be a lab leak, I should update to 27.5%, and if COVID was proven not to be a lab leak, I should stay around 19-20%.

This is so confusing: why bother doing “fake” math? How does he justify these numbers? Let’s look at the footnote:

Assume that before COVID, you were considering two theories:

  1. Lab Leaks Common: There is a 33% chance of a lab-leak-caused pandemic per decade.
  2. Lab Leaks Rare: There is a 10% chance of a lab-leak-caused pandemic per decade.

And suppose before COVID you were 50-50 about which of these was true. If your first decade of observations includes a lab-leak-caused pandemic, you should update your probability over theories to 76-24, which changes your overall probability of pandemic per decade from 21% to 27.5%.
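(For what it’s worth, the arithmetic in the footnote does check out; it’s only the priors that come from nowhere. A quick sketch of the mechanical update, assuming each decade is an independent coin flip, which is my reading, not something the footnote spells out:)

```python
# Scott's footnote, mechanically: two hypotheses about the per-decade
# chance of a lab-leak pandemic, with a 50-50 prior over them.
p_common, p_rare = 0.33, 0.10
prior_common = 0.5

# Overall per-decade probability before any observation (~21.5%,
# quoted as "21%").
overall_before = prior_common * p_common + (1 - prior_common) * p_rare

# Observe one lab-leak pandemic in the first decade: Bayes' rule.
posterior_common = (prior_common * p_common) / (
    prior_common * p_common + (1 - prior_common) * p_rare
)  # ~0.767, i.e. the "76-24" split

# New overall per-decade probability: ~0.277, the quoted "27.5%".
overall_after_leak = (
    posterior_common * p_common + (1 - posterior_common) * p_rare
)

# Observe NO lab-leak pandemic in the first decade instead.
posterior_common_quiet = (prior_common * (1 - p_common)) / (
    prior_common * (1 - p_common) + (1 - prior_common) * (1 - p_rare)
)  # ~0.427

# New overall: ~0.198, the quoted "19-20%".
overall_after_quiet = (
    posterior_common_quiet * p_common
    + (1 - posterior_common_quiet) * p_rare
)
```

So the formula is applied correctly; the entire argument lives in the two made-up hypotheses and the 50-50 prior.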

Oh, he doesn’t, he just made the numbers up! “I don’t have actual evidence to support my claims, so I’ll just make up data and call myself a ‘good Bayesian’ to look smart.” Seriously, how could a reasonable person have been expected to be concerned about lab leaks before COVID? It simply wasn’t something in the public consciousness. This looks like some serious hindsight bias to me.

I don’t entirely accept this argument - I think whether or not it was a lab leak matters in order to convince stupid people, who don’t know how to use probabilities and don’t believe anything can go wrong until it’s gone wrong before. But in a world without stupid people, no, it wouldn’t matter.

Ah, no need to make the numbers make sense, because stupid people wouldn’t understand the argument anyway. Quite literally: “To be fair, you have to have a really high IQ to understand my shitty blog posts. The Bayesian math is extremely subtle…” And convince stupid people of what, exactly? He doesn’t say, so what was the point of all the fake probabilities? What a prick.

18 points

Ah, if only the world wasn’t so full of “stupid people” updating their bayesians based on things they see on the news, because you should already be worried about and calculating your distributions for… inhales deeply terrorist nuclear attacks, mass shootings, lab leaks, famine, natural disasters, murder, sexual harassment, conmen, decay of society, copyright, taxes, spitting into the wind, your genealogy results, comets hitting the earth, UFOs, politics of any and every kind, and tripping on your shoe laces.

What… insight did any of this provide? Seriously. Analytical statistics is a mathematically consistent means of being technically not wrong, while using a lot of words, in order to disagree about feelings while saying nothing.

Risk management is not, in fact, a statistical question. It’s an economics question about your opportunities. It’s why prepping is better seen as a hobby and a coping mechanism, not as a viable means of surviving the apocalypse. It’s why even when an EA uses their superpowers of Bayesian rationality, the answer in the magic eight ball is always just “try to make money, stupid”.

18 points

Hi, my name is Scott Alexander and here’s why it’s bad rationalism to think that widespread EA wrongdoing should reflect poorly on EA.

The assertion that having semi-frequent sexual harassment incidents go public is actually an indication of health for a movement, since it’s evidence that there’s no systemic coverup going on (and besides, everyone’s doing it), is, uh, quite something.

But surely of 1,000 sexual harassment incidents, the movement will fumble at least one of them (and often the fact that you hear about it at all means the movement is fumbling it less than other movements that would keep it quiet). You’re not going to convince me I should update much on one (or two, or maybe even three) harassment incidents, especially when it’s so easy to choose which communities’ dirty laundry to signal boost when every community has a thousand harassers in it.

17 points

ahh, I fucking haaaate this line of reasoning. Basically saying “If we’re no worse than average, then there’s no problem”, followed by some discussion of “base rates” of harassment or whatever.

Except that the average rate of harassment and abuse, in pretty much every large group, is unacceptably high unless you take active steps to prevent it. You know what’s not a good way to prevent it? Downplaying reports of harassment, calling the people bringing attention to it biased liars, and explicitly trying to avoid kicking out harmful characters.

Nothing like a so-called “effective altruist” crowing about having a C- passing grade on the sexual harassment test.

15 points

Scott: “Hmm, the reputation of the EA community that I am part of and love for some reason is tanking, due to the bad actions of its luminaries. What can I do to help? I know, I’ll bring up 9/11”

Empty room: “…”

“And I’ll throw out some made up statistics about terrorist attacks and how statistically we were due for a 9/11 and we overreacted by having any response whatsoever. And then I’ll show how that’s the same as when someone big in EA does something bad.”

“…”

“Especially since it’s common for people to, after a big scandal, try and push their agenda to improve things. We definitely don’t want that.”

“…”

“Also, on average there’s less SA in STEM, and even though there is still plenty of SA, we don’t need to change anything, because averages.”

“…”

“Anyway, time for dexy no. 5”

10 points

Hmm, the reputation of the EA community that I am part of and love for some reason is tanking, due to the bad actions of its luminaries.

“And it would be clear I’m full of shit if I put this at the start of the article, so I’ll bury the lede behind a wall of text”

15 points

and often the fact that you hear about it at all means the movement is fumbling it less than other movements that would keep it quiet

I just can’t get over how far this is from reality. like fuck, for a lot of these things the controversy is the community covering for the abuser, or evidence coming out that sexual harassment was covered up in the past. depressingly often in tech, the community doesn’t even try to keep it quiet; instead they just loudly endorse the abuser or talk about how there’s nothing they can do.

17 points

Scott is saying essentially that “one data point doesn’t influence the data as a whole that much” (usually true)… “so therefore you don’t need to change your opinions when something happens” which is just so profoundly stupid. Just so wrong on so many levels. It’s not even correct Bayesianism!

(if it happens twice in a row, yeah, that’s weird, I would update some stuff)

??? Motherfucker have you heard of the paradox of the heap? What about all that other shit you just said?

What is this really about, Scott???

Do I sound defensive about this? I’m not. This next one is defensive.
I’m part of the effective altruist movement.

OH ok. I see now. I mean I’ve always seen, really, that you and your friends work really hard to come up with ad hoc mental models to excuse every bit of wrongdoing that pops up in any of the communities you’re in.

You definitely don’t get this virtue by updating maximally hard in response to a single case of things going wrong. […] The solution is not to update much on single events, even if those events are really big deals.

Again, this isn’t correct Bayesian updating. The formula is the formula. Biasing against recency is not in it. And that’s just within Bayesian reasoning!

In a perfect world, people would predict distributions beforehand, update a few percent on a dramatic event, but otherwise continue pursuing the policy they had agreed upon long before.

YEAH BECAUSE IT’S A PERFECT WORLD YOU DINGUS.

16 points

Complete sidenote, but I hate how effective altruism has gone from “charities should spend more money on their charity and not on executive bonuses, here are the ones that don’t actually help anyone” to “I believe I will save infinity humans by colonizing Mars, so you can just starve to death today”.

13 points

I suspect a large portion of people in EA leadership were already on the latter train and posturing as the former. The former is actually kinda problematic in its own way! If a problem were solvable purely by throwing money at it, then what would be the need for a charity at all?

6 points

@swlabr @Tar_alcaran

Well, because (most) governments (mostly) *don’t* throw money at the problems that *could* be solved by throwing money at them.

Look at the malaria prevention or guinea worm eradication programs, for instance. Ten years ago or so, my first encounter with EA was a website talking about how many lives you could save or improve by giving money to NGOs focused on those issues.

Hell, look at homelessness in most “Western” countries, except Finland. Look at UBI. etc.

13 points

OK my knowledge of Bayes is rusty at best, but isn’t the idea that the occurrences should be relatively common, and/or not correlated?

So far, there has been zero or one[1] lab leak that led to a world-wide pandemic. Before COVID, I doubt anyone was even thinking about the probabilities of a lab leak leading to a worldwide pandemic.

Also, ideally, if there was a lab leak, then people running labs would take note and ensure that that particular failure mode doesn’t happen again. Thus the probability of an occurrence would be less than the first time it happened, because people actually take note of what has happened and change stuff.

Scottyboy could have used something that has occurred multiple times, like a nuclear power plant accident, but his audience loves nuclear power, so that’s a non-starter. Also it’s a given that the mainstream press is the big bad in the fight against nuclear, just because serious accidents with widespread death and economic destruction happen again and again with nuclear power.

Raising the lab leak “hypothesis” is just signalling to his base.


[1] depending on where you stand in current US politics

13 points

So far, there has been zero or one[1] lab leak that led to a world-wide pandemic. Before COVID, I doubt anyone was even thinking about the probabilities of a lab leak leading to a worldwide pandemic.

So, actually, many people were thinking about lab leaks, and the potential of a worldwide pandemic, despite Scott’s suggestion that stupid people weren’t. For years now, bioengineering has been concerned with accidental lab leaks because the understanding that risk existed was widespread.

But the reality is that guessing at probabilities of this sort of thing still doesn’t change anything. It’s up to labs to pursue safety protocols, which happens at the economic edge of the opportunity versus the material and mental cost of being diligent. Lab leaks may not change probabilities, but the event of one occurring does cause trauma, which acts not as some Bayesian correction but as an emotional correction, so that people’s motivation to at least pay more attention increases for a short while.

Other than that, the greatest rationalist on earth can’t do anything with their statistics about lab leaks.

This is the best paradox. Not only is Scott wrong to suggest people shouldn’t be concerned about major events (the traumatic update to individuals’ memory IS valuable), but he’s wrong to suggest that anything he or anyone does after updating their probabilities could possibly help them prepare meaningfully.

He’s the most hilarious kind of wrong.

8 points

If I could sum up everything that’s wrong with EA, it’d be,

“We can use statistics to do better than emotions!” in reality means “We are dysregulated and we aren’t going to do anything about it!!!”

13 points

Also, if you think either of these is true:

  1. Lab Leaks Common: There is a 33% chance of a lab-leak-caused pandemic per decade.
  2. Lab Leaks Rare: There is a 10% chance of a lab-leak-caused pandemic per decade.

You should probably be campaigning to increase safety or shut down the labs you think would be responsible. 10% risk of pandemic per decade due to lab leaks (so in addition to viruses mutating on their own) isn’t rare or an acceptable risk.
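And even the “rare” theory compounds horribly if you take it seriously. A back-of-the-envelope sketch (assuming, as the footnote implicitly does, that decades are independent):

```python
# Under "Lab Leaks Rare" (10% chance per decade), the chance of at
# least one lab-leak pandemic over N independent decades.
P_PER_DECADE = 0.10

def at_least_one(n_decades: int) -> float:
    """Probability of one or more lab-leak pandemics in n_decades."""
    return 1 - (1 - P_PER_DECADE) ** n_decades

# Over a 50-year span: ~41% chance of at least one lab-leak pandemic.
fifty_years = at_least_one(5)
# Over a century: ~65%.
century = at_least_one(10)
```

A coin-flip-over-a-century risk is not something you shrug at and file under “base rates”.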

9 points

Thanks for bringing up the dog whistles. We haven’t talked about the dog whistles enough here. My fave has gotta be him bringing up the school shooting one.

9 points

Before COVID, I doubt anyone was even thinking about the probabilities of a lab leak leading to a worldwide pandemic.

This makes me wonder. We know the Rationalists did worry about a global pandemic before COVID-19; we checked the wastewater, and the smug particles increased exponentially for a short time in Feb 2020. But did they also worry about a normal lab leak like the one that might have happened here? Or was it all nature/terrorism/AGI stuff?

10 points

For a while there, when it looked as if only the rich were gonna be able to source N95 masks and everyone else was gonna die, the SV elite were all aboard with this being the new Black Death. As soon as it became apparent that the only way to deal with it was through massive government support, they did a 180 and started talking about how it wasn’t that bad after all.

8 points

I’m myself just annoyed that the other Scott (no, not the cartoonist; the other smart Scott) blamed sneerers for COVID being worse (while sneerclub itself was agreeing with the Rationalists that people should be careful and that it wasn’t a non-event). And that this Scott above argued that people should stop smoking to help against COVID (not that he had any proof for that; he just disliked that people smoke (yes, as good Bayesians we should now increase our ‘is the Rationalist thought leader lying to me’ priors)). The rest I don’t really recall that much.

6 points

“priors updated” was the same desired outcome all along.

7 points

smug particles

Goddamn rats and their smug particles! (jk)

4 points

Anyone remember the South Park episode with the Prius emitting clouds of Smug?

13 points

Pay no attention to the man behind that curtain. The rate of men behind curtains is actually quite low. Do not doubt the great and powerful Oz.


SneerClub

!sneerclub@awful.systems


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it’s amusing debate.

[Especially don’t debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
