Not a good look for Mastodon - what can be done to automate the removal of CSAM?

172 points

https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf

I’d suggest that anyone who cares about the issue take the time to read the actual report, not just drama-oriented news articles about it.

60 points

So if I’m understanding right, based on their recommendations, this will all be addressed as more moderation and QOL tools are introduced as we move further down the development roadmap?

-77 points

What development roadmap? You’re not a product manager and this isn’t a Silicon Valley startup.

120 points
Removed by mod
51 points

If I can try to summarize the main findings:

  1. Computer-generated (e.g., Stable Diffusion) child porn is not criminalized in Japan, and so many Japanese Mastodon servers don’t remove it
  2. Porn involving real children is removed, but not immediately, since it depends on instance admins catching it, and they have other things to do. Also, when an account is banned, the Mastodon server software does not send out a “delete” for all of their posted material (which would signal other instances to delete it)

Problem #2 can hopefully be improved with better tooling. I don’t know what you do about problem #1, though.

28 points

One option would be to decide that the underlying point of removing real CSAM is to avoid victimizing real children; and that computer-generated images are no more relevant to this goal than Harry/Draco slash fiction is.

1 point

And are you able to offer any evidence to reassure us that simulated child pornography doesn’t increase the risk to real children as pedophiles become desensitised to the content and escalate (you know, like what already routinely happens with regular pornography)?

Or are we just supposed to sacrifice children to your gut feeling?

11 points

Such a signal exists in the ActivityPub protocol, so I wonder why it’s not being used.
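For reference, it’s the Delete activity (wrapping a Tombstone object) from the ActivityPub spec. A minimal sketch of the payload a server could push to its peers when it bans an account; the actor and status URLs are made up, and real delivery would also involve signing the request and POSTing it to each remote inbox:

```python
# Minimal sketch of the ActivityPub "Delete" signal discussed above.
# The actor and status URLs are hypothetical; a real server would sign
# the request (HTTP Signatures) and POST it to each follower's inbox.
import json

def build_delete_activity(actor_url: str, status_url: str) -> dict:
    """Build a Delete activity wrapping a Tombstone, per the ActivityPub spec."""
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Delete",
        "actor": actor_url,
        "object": {
            "type": "Tombstone",  # placeholder left where the deleted object was
            "id": status_url,
        },
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    }

if __name__ == "__main__":
    activity = build_delete_activity(
        "https://example.social/users/banned_user",
        "https://example.social/users/banned_user/statuses/1",
    )
    print(json.dumps(activity, indent=2))
```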

5 points

> I don’t know what you do about problem #1, though.

Well, the simple answer is that it doesn’t have to be illegal to remove it.

The legal question is a lot harder, considering AI image generation has reached levels that are almost indistinguishable from reality.

3 points

In which case, admins should err on the side of caution and remove something that might be illegal.

I personally would prefer to have nothing remotely close to CSAM, but as long as children aren’t being harmed in any conceivable way, I don’t think it would be illegal to post art containing children. But communities should absolutely manage things however they think is best for their community.

In other words, I don’t think #1 is a problem at all; imo, things should only be illegal if there’s a clear victim.

28 points

> 4.1 Illustrated and Computer-Generated CSAM

Stopped reading.

Child abuse laws “exclude anime” for the same reason animal cruelty laws “exclude lettuce.” Drawings are not children.

Drawings are not real.

Half the goddamn point of saying CSAM instead of CP is to make clear that Bart Simpson doesn’t count. Bart Simpson is not real. It is fundamentally impossible to violate Bart Simpson’s rights, because he doesn’t fucking exist. There is nothing to protect him from. He cannot be harmed. He is imaginary.

This cannot be a controversial statement. Anyone who can’t distinguish fiction from real life has brain problems.

You can’t rape someone in MS Paint. Songs about murder don’t leave a body. If you write about robbing Fort Knox, the gold is still there. We’re not about to arrest Mads Mikkelsen for eating people. It did not happen. It was not real.

If you still want to get mad at people for jerking off to the wrong fantasies, that is an entirely different problem from photographs of child rape.

5 points

Oh, wait, “Japanese” in the other comment, now I get it. This conversation is about AI loli porn.

Pfft, of course, that’s why no one is saying the words they mean, because it suddenly becomes much harder to take the stance, since hatred towards loli porn is not universal.

4 points

I mean, I think it’s disgusting, but I don’t think it should be illegal. I feel the same way about cigarettes, 2 girls 1 cup, and profane language. It’s absolutely not for me, but that shouldn’t make it illegal.

As long as there’s no victim, knock yourself out with whatever disgusting, weird stuff you’re into.

5 points

You should keep reading then, because they cover that later.

3 points

What does that even mean?

There’s nothing to “cover.” They’re talking about illustrations of bad things, alongside actual photographic evidence of actual bad things actually happening. Nothing can excuse that.

No shit they are also discussing actual CSAM alongside… drawings. That is the problem. That’s what they did wrong.

1 point

Okay, thanks for the clarification.

Everyone except you still very much includes drawn and AI-generated pornographic depictions of children within the basket of problematic content that should get filtered out of federated instances, so thank you very much, but I’m not sure your point changed anything.

6 points

They are not saying it shouldn’t be defederated; they are saying that reporting it to the authorities is pointless and that calling it CSAM is harmful.

-1 points

If you don’t think images of actual child abuse, against actual children, are infinitely worse than some ink on paper, I don’t care about your opinion of anything.

You can be against both. Don’t ever pretend they’re the same.

1 point

Oh no, what you describe is definitely illegal here in Canada. CSAM includes depictions here. Child sex dolls are illegal. And it should be that way because that stuff is disgusting.

-2 points

> CSAM includes depictions here.

Literally impossible.

Child rape cannot include drawings. You can’t sexually assault a fictional character. Not “you mustn’t.” You can’t.

If you think the problem with child rape amounts to ‘ew, gross,’ fuck you. Your moral scale is broken if there’s not a vast gulf between those two bad things.

149 points

Mastodon is a piece of software. I don’t see anyone saying “phpBB” or “WordPress” has a massive child abuse material problem.

Has anyone in history ever said “Not a good look for phpBB”? No. Why? Because it would make no sense whatsoever.

I feel kind of at a loss for words because of how obvious it should be. It’s like saying “paper is being used for illegal material. Not a good look for paper.”

What is the solution to someone hosting illegal material on an nginx server? You report it to the authorities. You want to automate it? Go ahead and crawl the web for illegal material and generate automated reports. Though you’ll probably be the first to end up in prison.

34 points

I get what you’re saying, but due to the federated nature, that CSAM can easily spread to many instances without their admins noticing it. Having even one piece of CSAM on your server is a huge risk for the server owner.

30 points

I don’t see what a server admin can do about it other than defederate the instant they get reports. Otherwise how can they possibly know?

-5 points

This could be a really big issue, though. People can make instances for really hateful and disgusting crap, but even if everyone defederates from them, it’s still giving them a platform, a tiny tiny corner of the internet to talk about truly horrible topics.

-1 points

That’s a dumb argument, though.

phpBB is not the host or the provider. It’s just something you download and install on your server, with the actual service provider (you, the owner of the server and operator of the phpBB forum) being responsible for its content and curation.

Mastodon/Twitter/social media is the host/provider/moderator.

10 points
Deleted by creator
78 points

According to corporate news, everything outside of the corporate internet is pedophiles.

30 points

Well, terrorists became boring, and they still want the loony wing of the GOP’s clicks, so best to back off on Nazis and pro-Russians, leaving pedophiles as the safest bet.

3 points

Nazis not being the go-to target for a poisoning-the-well approach worries me on many different levels.

3 points

Agreed. I’m in my 40s, and I’ve never seen anywhere near the level of subsurface signaling and intentional complacency we’re experiencing now.

55 points

Nothing you can do except go after server owners like usual. Has nothing to do with the fedi. Mastodon has nothing to do with it either, because anyone can pop up their own alternative server. This is one of many protocols they have used or will use to distribute this stuff.

This just in: criminals are using the TCP protocol to distribute CP!!! What can the internet do to stop this? Oh yeah, go after server owners and groups like usual.

11 points

Things are a bit complicated in the fediverse. Sure, your instance might not host any pedo community, but if a user on your instance subscribes to or interacts with those communities, the CSAM might get federated into your instance without you noticing. There are tools to help you combat this, but as an instance owner you can’t just assume it’s not your problem if some other instance hosts pedo stuff.

6 points

That is definitely alarming, and a downside of the fedi, but it seems like a necessary evil. Unfortunately, admins and mods of small communities in the fedi will be the ones exposed to this. There are better methods of handling this, though. There are shared block lists out there that already block out undesirable stuff like that, so they at least minimize how much of this disgusting stuff innocent mods, who are just regular unpaid people, have to see. Also, obviously those instances should be reported to the police, FBI, or whatever the heck.
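Consuming such a shared list is pretty simple, too. A minimal sketch, assuming a plain-text list with one domain per line (the URL and file format here are hypothetical; Mastodon admins can also import domain blocks from CSV in the admin UI):

```python
# Sketch of applying a shared domain blocklist to inbound federation.
# BLOCKLIST_URL and the one-domain-per-line format are hypothetical.
import urllib.request
from urllib.parse import urlparse

BLOCKLIST_URL = "https://example.org/shared-blocklist.txt"  # hypothetical

def fetch_blocked_domains(url: str = BLOCKLIST_URL) -> set[str]:
    """Download the list; keep non-empty, non-comment lines as domains."""
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8")
    return {line.strip().lower() for line in text.splitlines()
            if line.strip() and not line.startswith("#")}

def should_reject(actor_url: str, blocked: set[str]) -> bool:
    """Drop activities whose actor lives on a blocked (defederated) domain."""
    return (urlparse(actor_url).hostname or "").lower() in blocked
```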

4 points

There are databases of known CSAM files and their hashes; Mastodon could implement a filter that checks those when media is posted and when federating content.

Shadow banning those users would be nice, too.
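A minimal sketch of that kind of check. Caveat: production systems use perceptual hashes (e.g. PhotoDNA or PDQ, with hash lists from orgs like NCMEC) so that re-encoded copies still match; plain SHA-256 is used here only because it needs no external service, and the blocklist file is made up:

```python
# Sketch of hash-based filtering at upload time and on federated media.
# Plain SHA-256 only catches byte-identical files; real deployments use
# perceptual hashing services instead.
import hashlib
from pathlib import Path

def load_blocklist(path: str) -> set[str]:
    """Load known-bad hashes from a file with one hex digest per line."""
    return {line.strip().lower()
            for line in Path(path).read_text().splitlines()
            if line.strip()}

def is_blocked(media_bytes: bytes, blocklist: set[str]) -> bool:
    """True if the exact bytes match a known-bad hash."""
    return hashlib.sha256(media_bytes).hexdigest() in blocklist

# Hypothetical integration point: run this in the upload handler and
# again when ingesting media attachments from federated posts.
# blocklist = load_blocklist("known_bad_hashes.txt")  # hypothetical file
# if is_blocked(upload_bytes, blocklist):
#     reject_and_report(upload_bytes)                 # hypothetical hook
```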

1 point

They are talking about AI-generated images. That’s the volume part.

