New Mexico is seeking an injunction to permanently block Snap from practices allegedly harming kids. That includes a halt on advertising Snapchat as “more private” or “less permanent” due to the alleged “core design problem” and “inherent danger” of Snap’s disappearing messages. The state’s complaint noted that the FBI has said that “Snapchat is the preferred app by criminals because its design features provide a false sense of security to the victim that their photos will disappear and not be screenshotted.”
I don’t understand how this would be fine, but pedophiles generating such images at home without distributing them would be illegal.
Because the cops are creating CSAM and putting it out there in some shape or form, which you could argue would encourage pedophiles the same way circulating AI images would.
Either generating underage sensitive material is always wrong or it’s not.
Under federal law, creating fictional CSAM at home without transmitting it is legal.
In the few AI arrests I’ve seen, the people involved had transmitted the images.
https://en.wikipedia.org/wiki/Legal_status_of_fictional_pornography_depicting_minors
Section 1466A of Title 18, United States Code, makes it illegal for any person to knowingly produce, distribute, receive, or possess with intent to transfer or distribute visual representations, such as drawings, cartoons, or paintings that appear to depict minors engaged in sexually explicit conduct and are deemed obscene.
Specifically: “distribute, receive, or possess with intent to transfer or distribute visual representations”
By the statute’s own terms, the law does not make all fictional child pornography illegal, only that found to be obscene or lacking in serious value. The mere possession of said images is not a violation of the law unless it can be proven that they were transmitted through a common carrier, such as the mail or the Internet, transported across state lines, or of an amount that showed intent to distribute.[135]
Edit: So it has to be 100% locally generated and never transmitted. You still wouldn’t want to try to fight this in court, though; I’m sure they’d do their best to throw you in jail, or fabricate an intent to distribute if you’d made a lot, given all the images you’d generate while trying to get the ones you actually wanted.
Edit: Also, this is federal law; there may be other state laws.
The article didn’t say the cops generated AI CSAM. It said they created a profile pic, which was shown in the article.
So if someone generates a minor’s image and it’s not nude, is that not CSAM?
I’m genuinely asking, I always thought it was about sexualizing children, not whether they are nude or not.
I don’t think so. People keep throwing that acronym around but I suspect they didn’t read the article and find out that it was one normal picture of a high school-aged girl.
In an alternate universe:
Cops help pedophiles with AI pics of teen girl. Ethical triumph or new disaster?
I feel like there’s two stories here:

1. Snap is a pedo grooming ground

2. Police are generating fake CSAM

1. is, well, yeah. Like any major network you have creeps, Nazis, and pedos, and it’s good to root them out. It’s weird that Snap themselves don’t do that.

2. is … difficult. I feel like the cops who prompt GenAIs to produce (sexualized) images of children risk severe trauma. It must be punishing to ask a machine to produce fake images of morally reprehensible material. That’s gotta take a toll on you. What’s even more important is that those people could be spending their time taking down actual child abuse images.
I can’t wrap my head around that.
I think using AI imagery, and catfishing in general, is basically entrapment. In most civilized countries, that’s illegal for police to do.
Now, if they begin to actually trade in genuinely forbidden materials… sure, by all means arrest on that charge alone. That wouldn’t be unjustified. But provoking someone who might then turn around and harm a real child seems wrong.
So glad I got rid of that bullshlt long ago.