A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting how dangerous and widespread nefarious uses of generative AI have become.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as officers led McCorkle, still in his work uniform, out of the theater in handcuffs.

107 points

It’s hard to have a nuanced discussion because the article is so vague. It’s not clear what he’s specifically been charged with (beyond “obscenity,” not a specific child abuse statute?). Because any simulated CSAM laws have been, to my knowledge, all struck down when challenged.

I completely get the “lock them all up and throw away the key” visceral reaction - I feel that too, for sure - but this is a much more difficult question. There are porn actors over 18 who look younger: do the laws bar them from work that would be legal for others who just look older? If an AI was trained exclusively on those over-18 people, would the outputs then not be CSAM even if the images produced features that looked under 18?

I’m at least all for a “fruit of the poisoned tree” theory - if AI model training data sets include actual CSAM then they can and should be made illegal. Intentionally deepfaking real under-18 people is also not black and white (looking again to the harm factor), but I think it can be justifiably prohibited. I also think distribution of completely fake CSAM can arguably be outlawed (the situation here), since it will soon be impossible to tell AI imagery from real, and allowing it would undermine enforcement of vital anti-real-CSAM laws.

The real hard case is producing and retaining images of fully fake people, without real CSAM in the training data, solely locally (possession crimes). That’s really tough. Not only does its creation not directly hurt anyone, there’s a possible benefit in that it diminishes the market for real CSAM (potentially saving unrelated children from the abuse flowing from that demand), and it could also divert the producer’s unfulfilled impulse away from preying on children around them.

“Could,” because I don’t think there are studies that answer whether those are true.

28 points

I mostly agree with you, but a counterpoint:

Downloading and possession of CSAM seems to be a common first step in a person initiating communication with a minor with the intent to meet up and abuse them. I’ve read many articles over the years about men getting arrested for trying to meet up with minors, and one thing that shows up pretty often in these articles is the perpetrator admitting to downloading CSAM for years until deciding the fantasy wasn’t enough anymore. They become comfortable enough with it that it loses its taboo and they feel emboldened to take the next step.

CSAM possession is illegal because possession directly supports creation, and creation is inherently abusive and exploitative of real people. Generating it from a model that was trained on non-abusive content probably isn’t exploitative in itself, but there’s a legitimate question as to whether we as a society decide it’s associated closely enough with real world harms that it should be banned.

Not an easy question for sure, and it’s one that deserves to be answered using empirical data, but I imagine the vast majority of Americans would flatly reject a nuanced view on this issue.

30 points

The problem is that the empirical data cannot be gathered morally or ethically. You can’t show a bunch of people porn and then make a statistical observation of whether those shown child porn are more likely to assault children. So we have to go forward without that data.

I will anecdotally observe anal sex, oral sex, and facials have gone up between partners as prevalence in porn has gone up. That suggests but does not prove a direct statistical harm caused by even “ethically produced CSAM.”

27 points

I will anecdotally observe anal sex, oral sex, and facials have gone up between partners as prevalence in porn has gone up. That suggests but does not prove a direct statistical harm caused by even “ethically produced CSAM.”

Can we look at trends between consenting adults (who are likely watching porn of real people by the way) as an indicator of what pedophiles will do? I’m not so sure. It’s not like step sibling sex is suddenly through the roof now with it being the “trend” in porn.

Looking specifically at fake rape porn maybe and seeing if it increases rates of rape in the real world might be a better indicator.

8 points

True, it wouldn’t be ethical to conduct an experiment, but we can (and probably do) collect lots of observational data that can provide meaningful insight. People are arrested at all stages of CSAM-related offenses, from mere possession through distribution and solicitation to active abuse.

While observations and correlations are inherently weaker than experimental data, they can at least provide some insight. For example: “what percentage of those in possession of only artificially generated CSAM for at least one year go on to solicit minors” vs. the same figure for “real” CSAM.

If it seems that artificial CSAM is associated with a lower rate of solicitation, or if it ends up decreasing overall demand for “real” CSAM, then keeping it legal might provide a real net benefit to society and its most vulnerable even if it’s pretty icky.

That said, I have a nagging suspicion that the thing many abusers like most about CSAM is that it’s a real person and that the artificial stuff won’t do it for them at all. There’s also the risk that artificial CSAM reduces the taboo of CSAM and can be an on-ramp to more harmful materials for those with pedophilic tendencies that they otherwise are able to suppress. But it’s still way too early to know either way.

17 points

CSAM possession is illegal because possession directly supports creation

To expound on this: until now, the creation of CSAM required that children be sexually exploited. You could not have CSAM without children being harmed. But what about when no direct harm has occurred? Is lolicon hentai ‘obscene’? Well, according to the law and case law, yes, but it’s not usually enforced. If we agree that drawings of children engaged in sexual acts aren’t causing direct harm–that is, children are not being sexually abused in order to create the drawings–then how much different is a computer-generated image that isn’t based on any specific person or event? It seems to me that whether or not a pedophile might eventually decide they want more than AI-generated images is not relevant. Treating a future possibility as a foregone conclusion is exactly the rationale behind Reefer Madness and the idea of ‘gateway’ drugs.

Allow me to float a second possibility that will certainly be less popular.

Start with two premises: first, pedophilia is a characteristic that appears to be an orientation. That is, a true pedophile–a person exclusively sexually attracted to pre-pubescent children–does not choose to be a pedophile, any more than a person chooses to be gay. (My understanding is that very few pedophiles are exclusively pedophilic, though, and that many child molesters are opportunistic sexual predators rather than pedophiles.) Second, rates of sexual assault appear to have decreased as pornography availability has increased. So the question I would ask is: would wide availability of AI-generated CSAM–CSAM that didn’t cause any real, direct harm to children–actually decrease rates of child sexual assault?

5 points

With regards to your last paragraph: pedophiles can indeed be straight, gay, or bi. Pedophiles may also never become molesters, and molesters of children may not be pedophilic at all. It seems you understand this. I mentioned ITT that I read a newspaper article many years ago that was commissioned to show that access to CP would increase child abuse; it seemed to show the opposite.
If persons could use AI to generate their own porn of their own personal fantasies (whatever those might be) and NOT share that content, what then? Canada allows this for text (maybe certain visuals? Audio? IDK). I don’t know about current ‘obscene’ laws in the USA; however, I do recall reading about an art exhibit in NY which featured an upside-down urinal that was deemed obscene, then later deemed a work of art. I also recall seeing (via an internet image) a sculpture of what seemed to be a circle of children with penises as noses. Porn? Art? Comedy?

4 points

Hard to say. I generally agree with what you’ve said though. Also, lots of people have other fantasies that they would never enact in real life for various reasons (e.g. it’s unsafe, illegal, or both; edit: I should also absolutely list non-consensual here). I feel like pedophilia isn’t necessarily different.

However part of the reason loli/whatever is also illegal to distribute (it is, right? I assume it is at least somewhere) is that otherwise it helps people facilitate/organize distribution of real CSAM, which increases demand for it. That’s what I’ve heard at least and it makes sense to me. And I feel like that would apply to AI generated as well.

12 points

Downloading and possession of CSAM seems to be a common first step in a person initiating communication with a minor with the intent to meet up and abuse them.

But this is like the arguments used to say that weed is a “gateway drug” by talking about how people strung out on harder drugs almost always have done weed as well, ignoring everyone who uses only weed. But this is even hazier because we literally have no real idea how many people consume that stuff but don’t ‘escalate’.

I remember reading once, in some research out of Japan, that child molesters consume less porn overall than the average citizen. That seems counter-intuitive, but may not be if you consider the possibility that the material (in that case, primarily manga with anime-style drawings of kids in sexual situations) is actually curbing the incidence of the ‘real thing’, since the ones actually touching kids in the real world are reading those mangas less.

I’m also reminded of people talking about sex dolls that look like kids, and if that’s a possible ‘solution’ for pedophiles, or if it would ‘egg on’ actual molestation.

I think I lean on the side of ‘satiation’, from the limited bits of idle research I’ve done here and there. And if that IS in fact the case, then regardless of if it grosses me out, I can’t in good conscience oppose something that actually reduces the number of children who actually get abused, you know?

-3 points

It’s less that these materials are like a “gateway” drug and more like these materials could be considered akin to advertising. We already have laws about advertising because it’s so effective, including around cigarettes and prescriptions.

Second, the role that CP plays in most countries is complicated. It is used for blackmail. It is also used to generate money for countries. And it’s used as advertising for actual human-trafficking organizations. Similar organizations exist for snuff and gore, btw. And of course animals. And any combination of those three. Or did you all forget about the monkey torture videos, or the orangutan who was being sex trafficked? Or Daisy’s Destruction and Peter Scully?

So it’s important not to let these advertisers run their most famous monkey torture video through enough AI that they can claim it’s AI-generated, when it’s really just an ad for their monkey torture productions. And they do the same with CP, rape, gore, etc.

4 points

but there’s a legitimate question as to whether we as a society decide it’s associated closely enough with real world harms that it should be banned.

Why should that be a question at all? If it causes harm, ban it. If not, don’t. Being “associated with” should never be grounds for a legal statute.

1 point

generally a very good point, however i feel it’s important to point out some important context here:

the pedophiles you’re talking about in your comment are almost always members of tight knit communities that share CSAM, organize distribution, share sources, and most importantly, indulge their fantasies/desires together.

i would think that the correlation that leads to molestation is not primarily driven by the CSAM itself, but rather the community around it.

we clearly see this happening in other similarly structured and similarly isolated communities: nazis, incels, mass shooters, religious fanatics, etc.

the common factor in radicalization and development of extreme views in all these groups is always isolation and the community they end up joining as a result, forming a sort of parallel society with its own rules and ideals, separate from general society. over time people in these parallel societies get used to seeing the world in a way that aligns with the ideals of the group.

nazis start to see anyone not part of their group as enemies, incels start to see “females” instead of women, religious fanatics see sinners…and pedophiles see objects that exist solely for their gratification instead of kids…

I don’t see why molesters should be any different in this aspect, and would therefore argue that it’s the communal aspect that should probably be the target of the law, i.e.: distribution and organization (forums, chatrooms, etc.)

the harder it is for them to organize, the less likely these groups are to produce predators that cause real harm!

if on top of that there is a legally available outlet where they can indulge themselves in a safe manner without harming anyone, I’d expect rates of child molestation to drop significantly, because, again, there’s precedent from similar situations (overdoses in drug addicts, for example)

i think it is a potentially fatal mistake to think of pedophiles as “special” cases, rather than just another group of outcasts, because in nearly all cases of such pariahs the solutions that prove to work best in the real world are the ones that make these groups feel less like outcasts, which limits avenues of radicalization.

i thought these parallels are something worth pointing out.

20 points

Even worse, you don’t need CSAM to start with. If a model has been trained on regular porn and on the nude reference photography of under-18 models that is used for teaching anatomy drawing, it has enough information to combine the two. Hell, it probably doesn’t even need the under-18 subjects to actually be nude.

Hell, society tends to assume any nudity under 18 to be CSAM anyway, because someone could see it that way.

14 points

I don’t know if it’s still a thing, but I’m reminded of some law or regulation that was passed a while back in Australia, iirc, that barred women with A-cup busts from working in porn, the “reasoning” being that their flatter chests made them look too similar to prepubescent girls, lol…

Not only stupid but also quite insulting to women, imo.

13 points

Because any simulated CSAM laws have been, to my knowledge, all struck down when challenged.

To the best of my knowledge, calling drawn works obscene has been upheld in courts, most often because the artist(s) lack the financial ability to fight the charges effectively. The artist for the underground comic “Boiled Angel” had his conviction for obscenity upheld–most CSAM work falls under obscenity laws–and ended up giving up the fight to clear his name.

6 points

Oh, for sure. I’m talking about laws specifically targeted to minors. “Obscenity” is a catch-all that is well-established, but if you are trying to protect children from abuse, it’s a very blunt instrument and not as effective as targeted abuse and trafficking statutes. The statutory schemes used to outlaw virtual CSAM have failed to my knowledge.

For example: https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition

That case was statutorily superseded in part by the PROTECT Act, which attempted to differentiate itself by…relying on an obscenity standard. So it’s a bit illusory that it does anything new.

4 points

The PROTECT Act has been, so far, found to be constitutional, since it relies on the obscenity standard in regards to lolicon hentai. Which is quite worrisome. It seems like it’s a circular argument/tautology; it’s obscene for drawn art to depict child sexual abuse because drawings of child sexual abuse are obscene.

1 point

simulated CSAM

When I used this phrase, someone told me it described a nonexistent concept, and that the CSAM term existed in part to differentiate between content where children were harmed to make it versus not. I didn’t wanna muddy any waters but do you have an opposing perspective?

Deepfaking intentionally real under 18 people is also not black and white

Interesting. Sounds real bad. See what you mean about harm factor though.

-1 points

I’m at least all for a “fruit of the poisoned tree” theory - if AI model training data sets include actual CSAM then they can and should be made illegal.

Now all AI is illegal. It’s trained via scraping the internet, which will include CP as well as every other image.

1 point

There’s “copyright illegal,” and then there’s “CP illegal.” Those are two very different things.

1 point

I don’t understand the relevance of your statement.

84 points

This creates a significant legal issue - AI generated images have no age, nor is there consent.

The difference in appearance between ages 16 and 18 is minimal, but the legal difference is immense - and it rests entirely on a concept (a real person’s age) that cannot apply to a generated image.

How do you define what’s depicting a fictional child, especially without including real adults? I’ve met people who believe that preferring a shaved pubic area is pedophilia, even though the vast majority of adult women shave. On the flip side, teenagers from the 70s and 80s would be mistaken for 40+ today.

Even the extremes aren’t clear. Adult star “Little Lupe”, who was 18+ in every single appearance, lacked most secondary sex characteristics. Experts testified in court that she could not possibly be an adult. Except she was, and there’s full documentation to prove it. Would AI trained exclusively on her work be producing CSAM?

13 points

Well, also this is nothing new, unfortunately. See Lolis. Or maybe don’t…

-24 points

To paraphrase someone smarter than me, “I’ll know it when I see it.”

But naturally I don’t want to see it. One of the things I miss least about reddit is the constant image posts of anime characters, who may be whatever age they claim but are clearly representative of very young girls with big tiddies bolted on. It’s gross, but it’s also a problem that’s more widespread and nebulous than most people are willing to admit.

102 points

“I’ll know it when I see it.”

I can’t think of anything scarier than that when dealing with the legality of anything.

28 points

I’m nearly 40 and still regularly get carded while other people out with me do not so it’s not just “we card everyone”. People are bad at judging age.

18 points

https://en.m.wikipedia.org/wiki/I_know_it_when_I_see_it

The article really downplays the criticism of the phrase; it’s actually criticised quite often for being so subjective.

3 points

Sometimes something can’t have a perfect definition. What’s the difference between a gulf, a bay, and a channel? Where does the shoreline become a beach? When does an arid prairie become a desert? How big does a town have to grow before it becomes a city? At what point does a cult become a religion? When does a murder become premeditated vs. a crime of passion? When does a person become too drunk to give active consent? Human behavior is a million shades of gray, just like everything else we do, and the things that don’t fit into our clear definitions are where the law needs to be subjective.

17 points

Just when trying to guess someone’s age (we’ll assume completely family-friendly and above board), think back to high school. How old did you and your peers look? Now go take a look at high schoolers today. They probably seem a lot younger than you did. The longer it’s been (i.e. the older you are), the younger they look. Which means, “when I see it” depends entirely on the age of the viewer.

This isn’t even just about perception and memory: modern style is based on and heavily influenced by youth, and it’s continuing to move in that direction. This is why actors in their 30s - with carefully managed hair, skin, makeup, and wardrobe - have been able to convincingly portray high schoolers. So it’s not just you - teens really are looking younger each year. But they’re still the same age.

3 points

Wtf. Style is what makes kids look young or old to us, because we have been heavily marketed to and follow trends. That’s why, when the mullet/porn-stache style came back, those Dahmer kids looked like they were in their 40s.

You’re getting older each year so teens look younger to you.

Name even one actor in their thirties who convincingly played a high schooler. Literally who?

77 points

I don’t see how children were abused in this case? It’s just AI imagery.

It’s the same as saying that people get killed when you play first person shooter games.

Or that you commit crimes when you play GTA.

34 points

Then also every artist creating loli porn would have to be jailed for child pornography.

8 points
Removed by mod
19 points

But this is the US… and it’s kind of a double standard if you’re not arrested for drawing it but you are for generating it.

-8 points

Not a great comparison, because unlike with violent games or movies, you can’t say there is no danger in allowing these images to be created or distributed. If they are indistinguishable from the real thing, it becomes impossible to identify actual human victims.

There’s also a strong argument that the availability of imagery like this only encourages behavioral escalation in people who suffer from the affliction of being a sick fucking pervert pedophile. It’s not methadone for them, as some would argue. It’s just fueling their addiction, not replacing it.

-21 points

The difference is intent. When you’re playing a FPS, the intent is to play a game. When you play GTA the intent is to play a game.

The intent with AI generated CSAM is to watch kids being abused.

29 points

Who’s to say there aren’t people playing games to watch people die?

-6 points

There may well be the odd weirdo playing Call of Duty to watch people die.

But everyone who watches CSAM is watching it to watch kids being abused.

10 points

When you’re playing a FPS, the intent is to watch people being murdered.

How is this argument any different?

3 points

Punishing people for intending to do something is punishing them for thought crimes. That is not the world I want to live in.

-1 points

This guy did do something - he either created or accessed AI generated CSAM.

-35 points

It’s just AI imagery.

Fantasising about sexual contact with children indicates that this person might groom children for real, because they have a sexual interest in doing so. As someone who was sexually assaulted as a child, it’s really not something that needs to happen.

84 points

indicates that this person might groom children for real

But unless they have already done it, that’s not a crime. People are prosecuted for actions they commit, not their thoughts.

65 points

I agree, this line of thinking quickly spirals into Minority Report territory.

22 points

Seems like fantasizing about shooting people or carjacking indicates that person might do that activity for real too. There are a lot of carjackings nowadays, and you know GTA is real popular. Hmmm. /s But seriously, I’m not sure your first statement has merit, especially when you look at where to draw the line: anime, manga, oil paintings, books, thoughts in one’s head.

-10 points

If you’re asking whether anime, manga, oil paintings, and books glorifying the sexualization of children should also be banned, well, yes.

This is not comparable to glorifying violence, because real children are victimized in order to create some of these images, and the fact that it’s impossible to tell makes it even more imperative that all such imagery is banned, because the existence of fakes makes it even harder to identify real victims.

It’s like you know there’s an armed bomb on a street, but somebody else filled the street with fake bombs, because they get off on it or whatever. Maybe you’d say making fake bombs shouldn’t be illegal because they can’t harm anyone. But now suddenly they have made the job of law enforcement exponentially more difficult.

-15 points

If you want to keep people who fantasise about sexually exploiting children around your family, be my guest. My family tried that, and I was raped. I didn’t like that, and I have drawn my own conclusions.

-37 points

Well, the image generator had to be trained on something first in order to spit out child porn. While it may be that the training set was solely drawn/rendered images, we don’t know that, and even if the output were in that style, it might very well be photorealistic images generated from real child porn and run through a filter.

48 points

An AI that is trained on children and nude adults can infer what a nude child looks like without ever being trained specifically with those images.

-13 points

Your argument is hypothetical. Real-world AI was trained on images of abused children.

https://cyber.fsi.stanford.edu/news/investigation-finds-ai-image-generation-models-trained-child-abuse

48 points

How many corn dogs do you think were in the training data?

6 points

Wild corn dogs are an outright plague where I live. When I was younger, me and my buddies would lay snares to catch corn dogs. When we caught one, we’d roast it over a fire to make popcorn. Corn dog cutlets served with popcorn from the same corn dog is a popular meal, especially among the less fortunate, even though some of the affluent consider it the equivalent of eating rat meat. When me pa got me first rifle when I turned 14, I spent a few days just shooting corn dogs.

-1 points

It didn’t generate what we expect and know a corn dog is.

Hence it missed, because it doesn’t know what a “corn dog” is.

You have proven the point: it couldn’t generate CSAM without some being present in the training data.

34 points

Just say you don’t get how it works.

16 points

we don’t know that

might

Unless you’re operating under “guilty until proven innocent”, those are not reasons to accuse someone.

-38 points

How was the model trained? Probably using existing CSAM images. Those children are victims. Making derivative images of “imaginary” children doesn’t negate its exploitation of children all the way down.

So no, you are making false equivalence with your video game metaphors.

59 points

A generative AI model doesn’t require the exact thing it creates in its datasets. It most likely just combined regular nudity with a picture of a child.

-13 points

In that case, the images of children were still used without their permission to create the child porn in question

27 points

Can you or anyone verify that the model was trained on CSAM?

Besides, an AI model doesn’t need explicit content to derive from in order to create a naked child.

6 points

No they are not.

-24 points

You’re defending the generation of CSAM pretty hard here, with a vague “but no child we know of was involved” as the defense.

12 points

While I wouldn’t put it past Meta & Co. to explicitly seek out CSAM to train their models on, I don’t think that is how this stuff works.

2 points

Wrong.

-5 points

But the AI companies insist the outputs of these models aren’t derivative works in any other circumstances!

6 points

Cuz they’re not

63 points

Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.

Depending on which way it goes, it could be massively helpful for protecting kids. I just don’t have a sense for what the effect would be, and I’ve never seen any experts weigh in.

33 points

Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.

From bits/articles I’ve seen here and there over the years about other things that are kind of in the same category (porn comics with child characters in them, child-shaped sex dolls), the latter seems to be more the case.

I’m reminded of when people were arguing that when Internet porn became widespread, the incidence of rape would go through the roof. And then literally the opposite happened. So…that pushes me toward hypothesizing that the latter is more likely to be the case, as well.

22 points

In Australia, cartoon child porn is treated the same as actual child porn under the law. Not that it answers your question, but it’s interesting.

I’d imagine for your question “it depends”, some people who would have acted on their urges may get their jollies from AI child porn, others who have never considered being pedophiles might find the AI child porn (assuming legal) and realise it’s something they were into.

I guess it may lower the production of real child porn which feels like a good thing. I’d hazard a guess that there are way more child porn viewers than child abusers.

10 points

In Australia a 30 year old woman cannot be in the porn industry if she has small breasts. That, and the cartoon ban both seem like overcompensating.

12 points

Nothing says “we’re protecting children” like regulating what adult women can do with their bodies.

Conservatives are morons, every time.

16 points

I seem to remember Sweden did a study on this, but I don’t really want to google around to find it for you. Good luck!

13 points

I’d like to know what psychologists think about it. My assumption is the former, it escalates their fantasizing about it and makes them more likely to attack a child.

There seems to be no way to conduct that experiment ethically, though.

13 points

Real question: “do we care if AI child porn is bad?” Based on most countries’ laws, no.

12 points

There are a lot of layers to it.

  • For some, it might actually work in the opposite direction, especially if paired with the wrong kind of community around it. I used to moderate anime communities, and the number of loli fans wanting to lower the age of consent to 12 or even lower was way too high; they called anyone opposed to loli the “real predators,” because they liked their middle-school-tier arguments (which just further polarized the fandom when the culture wars started).
  • Even worse, the more realistic depictions might actually work against that goal, while with (most) loli stuff it’s at least obvious that it’s drawn.
  • An often-overlooked issue is data laundering: just call your real CP “AI generated,” or add some generative-AI artifacts to your collection. Hungary bans overly realistic drawings and paintings of that kind, because people did exactly this with traditional means, creating tracings as realistic as possible (calling CP “artistic nudes” didn’t work out here, at least).
8 points

In Canada even animated cp is treated as the real deal

12 points

In Norway, imagining or describing acts with a 16-year-old is CP, but having sex with a 16-year-old is perfectly legal

3 points

Lol damn it Norway

5 points

You’re missing the point. They don’t care what’s more or less effective for helping kids. They want to punish people who are different. In this case nobody is really going to step up to defend the guy for obvious reasons. But the motivating concept is the same for conservatives.

4 points

There definitely is opportunity in controlled treatment. But outside of that, I believe there are too many unknowns.

4 points

Wikipedia seems to suggest research is inconclusive whether consuming CSAM increases the likelihood of committing abuse.

3 points

There are literally mountains of evidence suggesting that normalizing child abuse in any fashion increases the rate at which children are actually abused. Yet that never stops a highly upvoted comment from suggesting that jacking it to simulated kids is somehow a “release valve” for actual pedophilia, which makes absolutely no fucking sense given everything we know about human sexuality.

If this concept were true, hentai fans would likely be some of the most sexually well-adjusted people around, having tons of experience releasing their real-world sexual desires via a virtual medium. Instead, we find that these people objectify the living shit out of women, because they’ve adopted an insanely over-idealized caricature of what a woman should look and act like that is completely divorced from reality.

-19 points

> Depending on which way it goes, it could be massively helpful for protecting kids

Weeeelll, only until the AI model needs more training material…

20 points

That’s not how it works. The “generative” in “generative AI” is there for a reason.

5 points

You need more training material to train a new AI. Once the model exists, it can produce as many pictures as you want. And you can get good results even with models that run locally on a regular computer.

-2 points

I’m not sure that’s how it would work, but this is exactly the kind of thinking we need. Effects: intended plus unintended equals ???

56 points

Could this be considered a harm reduction strategy?

Not that I think CSAM is good in any way, but if it saves a child would it be worthwhile? Like if these pedos were to use AI images instead of actual CSAM would that be any better?

I’ve read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if it would be a less harmful thing since it’s such a problem.

37 points

Many years ago (about 25), I read an article in a newspaper (I don’t recall the name, but it may have been The Computer Paper, which is archived online someplace). The article noted that a study had been commissioned to show that CP access increases child abuse. The study seemed to show the opposite.

Here’s the problem with even AI-generated CP: it might lower abuse in the beginning, but with increased access it would ‘normalise’ the perception of such conduct. This would likely increase abuse over time, even involving persons who may not have been so inclined otherwise.

This is all very complex; a solution isn’t simple. Shunning things won’t help, though, and that seems to be the most popular way of dealing with the issue at the moment.

25 points

Actual pedophiles (a lot of CSA is abuse of power, not pedophilia, though to be clear, fuck abusers either way) have a high rate of suicidal ideation because they think it’s as fucked up as everyone else does. Of course we can’t just say “sure, AI material is legal now,” but I could imagine a regulated system accessed via doctors, akin to how controlled substances work.

People take this firm “kill em all” stance, but these people just feel the way they do, the same as I do towards women or a gay man feels toward men. It just is what it is; we all generally agree being gay isn’t a choice, and this is no different. As long as they don’t act on it, I think we should be sympathetic and open to helping them live a less tortured life.

I’m not saying this is definitely how we should do it, but we should be open to exploring the issue instead of full-stop demonization.

11 points

Dan Savage coined the term “gold star pedophile” in a column years ago, referring to people who acknowledge their attraction to children but never act on it by harming a child or accessing CSAM. I do feel bad for these people because there are no resources to help them. The only way they can access actual therapeutic resources for their condition is by offending and going to jail. If the pedophile goes to a therapist and confesses attraction to children, therapists are mandated reporters and will assume they’re going to act on it. An article I read a few years back interviewed members of an online community of non-offending pedophiles who essentially made their own support group since no one else will help them, and nearly all research on them is from a forensic (criminal) context.

There’s a pretty good article by James Cantor talking about dealing with pedophiles in a therapeutic context here.

Don’t get me wrong - I think offenders need to be punished for what they do. I unfortunately have a former best friend who has offended. He’s no longer in my life and never will be again. But I think we could prevent offenders from reaching that point and hurting someone if we did more research and found ways to stop them before it happened.

6 points

I agree for the most part, particularly that we should be open minded.

Obviously we don’t have much reliable data, which I think is critically important.

The only thing I would add is that I’m not sure treating a desire for CSAM would be the same as treating substance abuse. “Weaning an addict off CSAM” seems like a strange proposition to me.

18 points

“Normalized” violent media doesn’t seem to have increased the prevalence of real world violence.

4 points

I actually think video games reduce crime in general. Bad kids are now indoors getting their thrills.

2 points

That makes sense. I don’t know what a better answer is, just thinking out loud.

13 points

You would think so, but you’re basically making a patchwork version of the actual illicit media, so it’s a dark, dark gray area for sure.

2 points

Hmm ok. I don’t know much about AI.

4 points

Generative AI is basically just really overpowered text/image prediction. It fills in the words or pixels that make the most sense based on the data it has been fed, so to get AI generated CSAM…it had to have been fed some amount of CSAM at some point or it had to be heavily manipulated to generate the images in question.

7 points

I guess my question is: does access to regular porn make people not want to have real sex with another person? Does it ‘scratch the itch,’ so to speak? Could they go the rest of their lives with only porn to satisfy them?

It depends on the person. I feel like most people would be unsatisfied with only porn, but that’s just anecdotal.

I honestly don’t think AI-generated CSAM is something the world needs produced. It’s not contributing to society in any meaningful way, and pedophiles who don’t offend or hurt children need therapy, while the ones who do need jail time (and therapy, but I’m in the US, so that’s a whole other thing). They don’t ‘need’ porn.

My own personal take is that giving pedophiles AI-generated CSAM is like showing alcohol ads to alcoholics, or like going to the strip club if you’re a sex addict. It’s probably not going to lead to good outcomes.

2 points

Think of it this way: what if the government said one day, “All child porn made before this date is legal; all child porn made after this date is illegal.”

You would end up with a huge corpus of “legal” child porn that pedophiles could use as a release, but you could become draconian about the manufacture of new child porn. This would, theoretically, discourage new child porn from being created, because the risk is too high compared to the legal stuff.

Can you see the problem? That’s right, in this scenario, child porn is legal. That’s fucked up, and we shouldn’t do that, even if it is “simulated”, because fuck that.

1 point

You definitely have a good point. I was just thinking hopefully to reduce harm but obviously I don’t want it to be legal.

0 points

“Because fuck that” is not a great argument.

0 points

By the same metric, I wonder why we don’t let convicted murderers and psychopaths work at slaughterhouses.

4 points

On the other hand, are people who work at slaughterhouses more likely to be murderers and psychopaths?

2 points

Perhaps, but I said convicted.

