Not the best news in this report. We need to find ways to do more.

32 points

Why would someone downvote this post? We have a problem and it’s in our best interest to fix that.

54 points

The report (if you can still find a working link) said that the vast majority of material that they found was drawn and animated, and hosted on one Mastodon instance out of Japan, where that shit is still legal.

Every time that little bit of truth comes up, someone reposts the broken link to the study, screaming about how the entire Fediverse is riddled with child porn.

18 points

So basically we had a bad apple that was probably already defederated by everyone else.

17 points

More like an apple with controversial but not strictly CSAM material, based in a country where its content is legal. Actually, not even an apple; Lemmy and the Fediverse aren't an entity. It's an open standard for anyone to use. You don't see the Modern Language Association being blamed for plagiarized essays written in MLA format, or the WHATWG being blamed because illegal sites are written in HTML, so it's not a fair comparison to hold Lemmy or the Fediverse responsible for what people do with an open standard either.

7 points

It’s Pawoo, formerly Pixiv’s own instance, which is infamous for this kind of content, and those are still “just drawings” (unless some artists are using illegal real-life references).

6 points

Here’s a link to the report: https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf
It is from 2023-07-24, so there’s a considerable chance it is not the one you were thinking about?

14 points

Since the release of Stable Diffusion 1.5, there has been a steady increase in the prevalence of Computer-Generated CSAM (CG-CSAM) in online forums, with increasing levels of realism. This content is highly prevalent on the Fediverse, primarily on servers within Japanese jurisdiction. While CSAM is illegal in Japan, its laws exclude computer-generated content as well as manga and anime.

Nope, seems to be the one. They lump the entire Fediverse together, even though most of the shit they found was in Japan.

The report notes 112 non-Japanese items found, which is a problem, but not a world-shaking one. There may also be issues with how federation handles deletion orders, again a real problem, but not a massive one.
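
For anyone unfamiliar with how that part works: a deletion federates as an ActivityPub Delete activity, and every remote instance that cached a copy has to apply it on its own. Roughly, the payload looks something like this sketch in Python (the URLs are invented for illustration):

    # Sketch of the ActivityPub "Delete" activity a server broadcasts when a
    # post is removed. Every remote instance holding a federated copy must
    # process this itself; an instance that ignores it (or never receives it)
    # keeps the content. Actor and object URLs are made-up examples.
    delete_activity = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Delete",
        "actor": "https://example.social/users/alice",
        "object": {
            "type": "Tombstone",
            "id": "https://example.social/users/alice/statuses/12345",
        },
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    }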

Really, what the report seems to be about is the fact that moderation is hard. Bad actors will work around any moderation you put in place, so it's a constant game of whack-a-mole. The report doesn't acknowledge this basic fact, pretends that no one is doing any moderation, and then lumps Japan in on top.

3 points

It’s 4Chan’s “9000 Penises” all over again

23 points

The study doesn’t compare its findings to any other platform, so we can’t really tell whether those numbers are good or bad. It just states absolute numbers, without going into much detail about its search process. So no, you can’t draw the conclusion that the Fediverse has a CSAM problem, at least not from this study.
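
The only thing you can compute from the report alone is a raw rate from its own figures, with nothing to compare it against; a quick back-of-the-envelope in Python:

    # Back-of-the-envelope using the report's own numbers. Absolute counts
    # only become comparable as rates, and the study gives no equivalent
    # rate for any other platform.
    posts_sampled = 325_000   # posts analyzed over a two-day period
    known_csam = 112          # hash-matched known images

    print(f"{known_csam / posts_sampled:.4%} of sampled posts")  # ~0.0345%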

Of course, that makes you wonder why they bothered to publish such a lackluster and alarmist study.

2 points

Pretty sure any quantity of CSAM that isn’t zero is bad…

15 points

Yes, but you don’t need a study to know that there will be some CSAM on the Fediverse. This is about whether using the Fediverse could make things worse, and just stating absolute numbers won’t answer that question. What it does do is make it sound like the Fediverse is a magnet for CSAM, when in reality the opposite could be true.

Or to put it differently: If these numbers turn out to be lower than what you get on similar platforms, then this could actually be a good sign, even though they still aren’t zero.

2 points

It’s not our responsibility. As long as moderators report it as they come across it and the FBI looks into it, what more can be done? Google, Twitter, Discord, Instagram, and Reddit have 10,000x the amount of CSAM.

13 points

Because it’s another “WON’T SOMEONE THINK OF THE CHILDREN” hysteria bait post.

They found 112 images of cp in the whole Fediverse. That’s a very small number. We’re doing pretty good.

5 points

It is not “in the whole Fediverse”; it is out of approximately 325,000 posts analyzed over a two-day period. And that is just for known images that matched existing hashes.

Quoting the entire paragraph:

Out of approximately 325,000 posts analyzed over a two day period, we detected 112 instances of known CSAM, as well as 554 instances of content identified as sexually explicit with highest confidence by Google SafeSearch in posts that also matched hashtags or keywords commonly used by child exploitation communities. We also found 713 uses of the top 20 CSAM-related hashtags on the Fediverse on posts containing media, as well as 1,217 posts containing no media (the text content of which primarily related to off-site CSAM trading or grooming of minors). From post metadata, we observed the presence of emerging content categories including Computer-Generated CSAM (CG-CSAM) as well as Self-Generated CSAM (SG-CSAM).
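
So the counting method is essentially two automated signals layered together: hash matches against known material, and high-confidence explicit media combined with flagged hashtags or keywords. A loose Python sketch of that logic, not the authors' actual code (the hash set is a stand-in for a PhotoDNA-style industry list, and explicit_confidence stands in for a SafeSearch-style score):

    from dataclasses import dataclass, field

    @dataclass
    class Media:
        perceptual_hash: str
        explicit_confidence: float  # 0.0..1.0, SafeSearch-style score

    @dataclass
    class Post:
        text: str
        media: list = field(default_factory=list)

    def flag_post(post, known_hashes, flagged_keywords, threshold=0.95):
        """Return the reasons (if any) a post would be counted by the study."""
        reasons = []
        text = post.text.lower()
        for m in post.media:
            # Signal 1: media whose hash matches a list of known material.
            if m.perceptual_hash in known_hashes:
                reasons.append("known-hash match")
            # Signal 2: highest-confidence explicit media on a post that also
            # uses hashtags/keywords tied to exploitation communities.
            elif m.explicit_confidence >= threshold and any(
                kw in text for kw in flagged_keywords
            ):
                reasons.append("explicit media + flagged keyword")
        return reasons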

6 points

How are the authors distinguishing between posts made by actual pedophiles and posts by law enforcement agencies known to be operating honeypots?


Still, that number should be zero.

3 points

In an ideal-world sense, I agree with you: nobody should abuse children, so media of people abusing children should not exist.

In a practical sense, whether talking about moderation or law enforcement, a rate of zero requires very intrusive measures such as moderators checking every post before others are allowed to see it. There are contexts in which that is appropriate, but I doubt many people would like it for the Fediverse at large.
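
True pre-moderation means something like a hold-for-review queue, where nothing is published until a human signs off. A toy Python sketch, not any real Lemmy or Mastodon mechanism:

    # Toy pre-moderation queue: every post is held until a moderator approves
    # it, so readers see nothing in the meantime. This is the "intrusive"
    # option, and the queue grows as fast as people post.
    from collections import deque

    pending = deque()   # posts awaiting human review
    visible = []        # posts everyone can see

    def submit(post):
        pending.append(post)       # held, not published

    def review_next(approve):
        post = pending.popleft()
        if approve(post):
            visible.append(post)   # only now does anyone else see it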

4 points

Because it’s talking about a report without linking to any report. That’s shouting into the void at best, clickbait at worst.

4 points

I was able to click through and access the report just fine.

4 points

Because it’s literally nothing a normie would read through. And some people thought Lemmy had a bad UI.

4 points

Given new commercial entrants into the Fediverse such as WordPress, Tumblr and Threads, we suggest collaboration among these parties to help bring the trust and safety benefits currently enjoyed by centralized platforms to the wider Fediverse ecosystem.

Because the solution sucks?

