I've noticed a bit of panic around here lately, and since I've had to continuously fight pedos for the past year, I have developed tools to help me detect and prevent this content.

As luck would have it, we recently published one of our anti-CSAM checker tools as a Python library that anyone can use. So I thought I could use this to help lemmy admins feel a bit safer.

The tool can either go through all the images in your object storage and delete all CSAM, or it can run continuously and scan and delete all new images as well. The suggested approach is to run it once with --all, and then run it as a daemon and leave it running.
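To illustrate the two modes, here is a rough Python sketch; the `storage` mapping and `looks_like_csam` classifier are hypothetical stand-ins for the real object-storage client and model, not the library's actual API:

```python
def scan_all(storage, looks_like_csam):
    """One-shot pass (the --all mode): check every stored image and
    delete anything flagged. `storage` stands in for the object store."""
    deleted = []
    for key, image_bytes in list(storage.items()):
        if looks_like_csam(image_bytes):
            del storage[key]
            deleted.append(key)
    return deleted


def scan_new(storage, seen, looks_like_csam):
    """One daemon iteration: only examine keys not checked before."""
    deleted = []
    for key in list(storage):
        if key in seen:
            continue
        seen.add(key)
        if looks_like_csam(storage[key]):
            del storage[key]
            deleted.append(key)
    return deleted
```

The daemon mode would call something like `scan_new` in a loop, keeping the `seen` set between iterations.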

A better option would be to retrieve the exact images uploaded via the lemmy/pict-rs API, but we're not quite there yet.

Let me know if you run into any issues or have improvements to suggest.

EDIT: Just to clarify, you should run this on your desktop PC with a GPU, not on your lemmy server!

105 points

Hey @db0@lemmy.dbzer0.com, just so you know, this tool is most likely very illegal to use in the USA. Something that your users should be aware of. I don’t really have the energy to go into it now, but I’ll post what I told my users in the programming.dev discord:

that is almost definitely against the law in the USA. From what I’ve read, you have to follow very specific procedures to report CSAM as well as retain the evidence (yes, you actually have to keep the pictures), until the NCMEC tells you you should destroy the data. I’ve begun the process to sign up programming.dev (yes you actually have to register with the government as an ICS/ESP) and receive a login for reports.

If you operate a website, and knowingly destroy the evidence without reporting it, you can be jailed. It’s quite strange, and it’s quite a burden on websites. Funnily enough, if you completely ignore your website, so much so that you don’t know that you’re hosting CSAM then you are completely protected and have no obligation to report (in the USA at least)

Also, that script is likely to get you even more into trouble because you are knowingly transmitting CSAM to ‘other systems’, like dbzer0’s aihorde cluster. that’s pretty dang bad…

here are some sources:

55 points

Note that the script I posted is not transmitting the images to the AI Horde.

Also keep in mind this tool is fully automated and catches a lot of false positives (due to the nature of the scan, it couldn't be otherwise). So one could argue it's a generic filtering operation, not explicit knowledge of CSAM hosting. But IANAL of course.

This is unlike cloudflare or other services which compare with known CSAM.

EDIT: That is to say, if you use this tool to forward these images to the govt, they are going to come after you for spamming them with garbage.

28 points

Cloudflare still has false positives, and the NCMEC does not care if they get false positives. If you read some of those links I provided, it wouldn't be considered a generic filtering operation, from how I'm reading it at least. I wouldn't take the chance, especially not with running the software on your own hardware in your own house, separate from the server.

I think you’re not in the US? So it’s probably different for your jurisdiction. Just want to make it clear that in the US, from what i’ve read up on, this would be considered against the law. You are running software to filter for CSAM, so you are obligated to report it. Up to 1 year jail time for not doing so.

18 points

Nothing that can’t be fixed by adding a quarantine option instead of deleting the offending picture. Hopefully someone can upload a patch for that?
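Such a patch could be small. As a sketch (against a hypothetical key/value storage interface, not the tool's real code), quarantining is just a move instead of a delete:

```python
def quarantine(storage, key, prefix="quarantine/"):
    """Move a flagged object under a quarantine prefix instead of
    deleting it, so potential evidence is retained for reporting."""
    new_key = prefix + key
    storage[new_key] = storage.pop(key)
    return new_key
```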

16 points

One can easily hook this script up to forward to whoever is needed, but I think they might be a bit annoyed after you send them a couple hundred thousand false positives without any CSAM.

18 points

Ugh, what a mess. Thought about this for a while today and three thoughts started circulating in my head:

  1. Hire an actual lawyer and get firm legal advice on this issue. I think this would fall to the admins, not the devs. Maybe an admin who wanted could volunteer to contact a lawyer? We could do a gofundme for one-time consultation legal fees.

  2. Stop using pictrs completely and instead use links to a third party such as Imgur or whatever. They’re in this business and I’m sure already have dealt with it and have a solution. Yes it sucks that Imgur (or whatever third party) could delete our legitimate images at any time, but IMHO it’s worth it to avoid this headache. At any rate it offloads the liability from an admin. Of course, IANAL and this is a question we would want to ask a lawyer about.

  3. Needing a GPU increases the expenses for an admin significantly. It will start to not be worth it for quite a few to keep their instance running.

Thanks for bringing up this point. This is obviously a nuanced issue that is going to need a well-thought-out solution.

5 points

The GPU doesn't have to be high-end, and the tool can run on someone's home PC.

3 points

Depending on the country, those laws may be different. Here is the story of a guy who ran a Tor exit node in Australia and who would have been protected had he been a company (the law was later changed).

https://lowendbox.com/blog/man-found-guilty-of-child-porn-because-he-ran-a-tor-exit-node-the-story-of-william-weber/

8 points

the ridiculous part of it is, as I understand it, if you completely ignore your website and essentially never know that you’re hosting CSAM then you cannot be held liable for it. But then, someone’s probably literally gonna come hunt you down to tell you in person (FBI) lol. So probably best to not ignore it.

77 points

This is extremely cool.

Because of the federated nature of Lemmy, many instances might be scanning the same images. I wonder if there might be some way to pool resources, so that if one instance has already scanned an image, some hash of it can be used to identify it and the whole AI model doesn't need to be rerun.

There's still the issue of how you trust the cache, but maybe there's some way for a trusted entity to maintain this list?
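One hypothetical shape for such pooling (nothing like this exists in the tool today): key a shared verdict cache on a content hash and only run the model on a cache miss.

```python
import hashlib

def check_with_cache(image_bytes, cache, classify):
    """Look up the image's SHA-256 in a shared verdict cache before
    paying for a model run; store the result for other instances."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest not in cache:
        cache[digest] = classify(image_bytes)
    return cache[digest]
```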

18 points

How about a federated system for sharing “known safe” image attestations? That way, the trust list is something managed locally by each participating instance.

Edit: thinking about it some more, a federated image classification system would allow some instances to be more strict than others.
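As a sketch of the idea (all names hypothetical): an instance could accept a "known safe" attestation only when enough of its locally trusted peers have vouched for that image hash, with the quorum acting as the per-instance strictness knob.

```python
def is_known_safe(image_hash, attestations, trusted, quorum=2):
    """Accept a 'known safe' verdict only if at least `quorum`
    locally trusted instances have attested to this hash."""
    vouchers = {inst for inst in attestations.get(image_hash, ())
                if inst in trusted}
    return len(vouchers) >= quorum
```

A stricter instance raises `quorum` or trims its trust list; a laxer one does the opposite.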

28 points

I think building some kind of system that allows smaller instances to rely on help from larger instances would be extremely awesome.

Like, lemmy has the potential to lead the fediverse in safety tools if we put the work in.

15 points

Consensus algorithms. But it means there will always be duplicate work.

No way around that unfortunately

9 points

Why? Use something like Raft: elect a leader, have the leader run the AI tool, then exchange results, with each node running its own subset of image hashes.

That does mean you need a trust system, though.
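Raft itself is out of scope here, but the work-splitting half is simple to sketch (a hypothetical helper, not part of any existing tool): deterministically map each image hash to exactly one node, so the subsets are disjoint and nothing is scanned twice.

```python
def assigned_node(image_hash, nodes):
    """Deterministically assign a hex image hash to one node, so each
    instance scans a disjoint subset of the images."""
    index = int(image_hash, 16) % len(nodes)
    return sorted(nodes)[index]
```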

13 points

I’d rather have a text-only instance with no media at all. Can this be done?

17 points

Yes, it is definitely possible! Just don't install/run pictrs with the server. Note that it will still be possible to link to external images.

12 points

My understanding is that it's bad practice to host images on Lemmy instances anyway, as it contributes to storage bloat. Instead of coming up with a one-off script solution (albeit a good effort), wouldn't it make sense to offload the scanning to a third party like imgur or catbox, which would already be doing it, and just link images into Lemmy? If nothing else, wouldn't that limit liability for the instance admins?

11 points

TBH, I wouldn't be comfortable outsourcing the scanning like that if I were running an instance. It only takes a bit of resources to know that you have done your due diligence. Hopefully the scan can be optimized to run faster.

47 points

As a test, I ran this on a very early backup of lemm.ee images from when we had very little federation and very little uploads, and unfortunately it is finding a whole bunch of false positives. Just some examples it flagged as CSAM:

  • Calvin and Hobbes comic
  • The default Lemmy logo
  • Some random user’s avatar, which is just a digital drawing of a person’s face
  • A Pikachu image

Do you think the parameters of the script should be tuned? I’m happy to test it further on my backup, as I am reasonably certain that it doesn’t contain any actual CSAM

52 points

This is normal. You should be worried if it weren't catching any false positives, as that would mean a lot of false negatives were slipping through. I am planning to add args to make it more or less severe, but it will never be perfect. So long as it's not catching most images, and most of the false positives are either porn or contain children, I consider it worthwhile.

I'll let you know when the severity functionality is updated.
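Conceptually, the severity knob is just a threshold on the classifier's score. This sketch (hypothetical names, not the library's API) shows the trade-off on labeled scores:

```python
def tune(scored, threshold):
    """Count false positives/negatives for a candidate threshold,
    given labeled classifier scores as [(score, is_actually_csam), ...].
    Lowering the threshold trades false negatives for false positives."""
    fp = sum(1 for s, bad in scored if s >= threshold and not bad)
    fn = sum(1 for s, bad in scored if s < threshold and bad)
    return fp, fn
```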

38 points

How do you even safely test scripts/tools like this 😵‍💫

28 points

I'd bet there's a CSAM test dataset of innocuous images that get picked up by the script. Not sure how the system works, but if it's hash-based, then it would be pretty simple to add that to the script.
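If it is hash-based, a test harness could look something like this (entirely hypothetical): register the hashes of a few innocuous stand-in images as forced positives, so the whole flag-and-delete pipeline can be exercised with harmless data.

```python
import hashlib

# Hashes of innocuous stand-in images the harness treats as positives.
TEST_POSITIVE_HASHES = {
    hashlib.sha256(b"harmless-test-image-1").hexdigest(),
}

def classify_for_testing(image_bytes, real_classifier=None):
    """Force a positive for registered test images; otherwise defer
    to the real classifier (or return False if none is given)."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in TEST_POSITIVE_HASHES:
        return True
    return real_classifier(image_bytes) if real_classifier else False
```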

36 points

Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

Fewer Letters More Letters
CF CloudFlare
CSAM Child Sexual Abuse Material
DNS Domain Name Service/System
HTTP Hypertext Transfer Protocol, the Web
nginx Popular HTTP server

4 acronyms in this thread; the most compressed thread commented on today has 6 acronyms.

[Thread #88 for this sub, first seen 28th Aug 2023, 22:25] [FAQ] [Full list] [Contact] [Source code]

8 points

Good bot

8 points

The JustNoMILers need this bot

