This is the best summary I could come up with:
“As a child psychiatrist, Tatum knew the damaging, long-lasting impact sexual exploitation has on the wellbeing of victimized children,” said US Attorney Dena J.
The trial evidence cited by the government includes a secretly made recording of a minor (a cousin) undressing and showering, and other videos of children participating in sex acts.
“Additionally, trial evidence also established that Tatum used AI to digitally alter clothed images of minors making them sexually explicit,” prosecutors said.
“Specifically, trial evidence showed that Tatum used a web-based artificial intelligence application to alter images of clothed minors into child pornography.”
In prepared remarks [PDF] delivered at a US Senate subcommittee hearing earlier this year, OpenAI CEO Sam Altman said, "GPT-4 is 82 percent less likely to respond to requests for disallowed content compared to GPT-3.5, and we use a robust combination of human and automated review processes to monitor for misuse."
A recent report from investigative organization Balkan Insight says groups like Thorn have been supporting CSAM detection legislation to make online content scanning compulsory in part because they provide that service.
Would be good if we could use this for porn. It's better than anyone actually being in porn. If AI takes over, then fewer people would be trafficked and in the porn industry.
What would help with sex trafficking is legalizing and regulating prostitution and destigmatizing it.
Regulating prostitution?
Because you think it is a normal, thought-out lifestyle? Would a girl with a normal upbringing (schooling, not abused during her youth, not abandoned, not fleeing war or guerrillas), eating her fill, choose this path?!
So, you’re okay with people making porn with your image? What about deepfakes of you committing rape? Or participating in CSAM?
Pretty sure the actual CP played a bigger role in the sentencing
Yeah, this is a scary, clickbaity headline meant to evoke a negative response about AI. AI is a tool, like a computer or a pipe wrench. It can be used for good or bad.
The sicko got what he deserved, but the AI bit is rather over-the-top.
The part that freaks me out is more that he was in an influential position in children’s lives and he was making images of the specific children who were his patients.
This is unfortunately not that uncommon. Pedos often work in child focused jobs. Very disturbing, and that’s why background checks are important in those fields.
How come you are already using a short form, how often do you talk about this kind of thing???
CP got reworded to CSAM (child sexual abuse material) btw
afaik it's because "porn" is thought to imply consent
He used a web-based Stable Diffusion service to generate CP. Absolute genius-level move 😂
People in general really. Some of the stuff your average person does on the internet and their devices absolutely stumps me, and I’m not even that tech savvy.
Ease of use, plus people anticipate that there will be much more "noise" drowning their activities out in the daily torrent of information. Then, back to your point again: people are dumb and forget it's relatively easy to look out for certain things even with enormous data flow.
Things to never say before committing a crime:
“Wait, let me sign in with my Google account first.”
Our legal system basically relies on the fact that criminals are that stupid.
Same for our political system.
Healthcare should be built assuming everybody is an idiot, but instead you need a degree in medical billing codes just to read an invoice.
People have been caught making bomb threats using their own phones. Some people are just not bright enough to be able to survive in this world.
It’s for people like them that you have to have signs everywhere that say things like, don’t stick your hand in the crushy grindy place, this hot water is hot, and don’t drink bleach you’ll die.
I look forward to the inevitable news story about an inmate crushing his skull with an exercise weight.
In this case there are several crimes, but in the other one mentioned, about a Korean man, there is nothing but possession of generated content, on the argument that it is highly realistic (someone could say the same even of a sketch). Imprisoning people for acts that have neither victims nor any harm, direct or indirect, is more aberrant than possessing that material.
PS: I’m just talking about legality and rights, I know it’s controversial and I’m sure someone has something to argue against it, but if you’re going to accuse me of being a pedo just get lost you moron.
Criminalizing the creation, possession, or viewing of entirely artificial artwork is beyond unethical; it’s extraordinarily evil.
No it isn’t.
I don’t care if you find someone’s artwork gross, troubling, distasteful, immoral, etc… that’s art.
No, it’s child porn.
Careful, any time I point this out, the fascists come out of the woodwork to call me a pedo.
Can’t imagine why.
You realise the AI is being trained on pictures of real children, right?
So it’s wrong for it to be based on one child, but according to you the AI “art” (as you keep calling it) is okay as long as there are thousands of victims instead?
So you’re cool with images of 6 year olds being penetrated by a 40 year old as long as “tHe Ai DrEw iT sO nObOdY gOt HuRt”? I guess you could just set it as your desktop and phone wallpaper and everything would be fine. Let me know how that works out for you.
That’s some stunning mental gymnastics right there.
You realise the AI is being trained on pictures of real children, right?
Can you share a source? Just as people use the internet to distribute CP, there are undoubtedly circles where people are using ML for it. However, my understanding is that, by and large, popular models are not intentionally trained on any.
I know you know this, but you are not crazy. I’m astonished you are being down voted so hard. The pedo apology is so strong it’s making me not want to use Lemmy. This thread is worse than reddit.
Terrifying.
Is it really fascists doing that? Literal fascists? I don’t meet many of them in my daily life.
Here’s a piece of art by Balthus. It’s of a young girl in a skirt, leg hiked up and you can see her underpants: https://www.wikiart.org/en/balthus/thérèse-dreaming-1938
This piece is controversial, but evocative and thought-provoking, and it says something about an innocent time in our youth and the change of demeanor sexuality brings when we become aware of it.
People may not like this, but if you can set sexuality aside and understand that we were once "innocent" (meaning sex wasn't something we knew about; we just had these bodies we were told to hide in clothes), the painting takes on a whole new meaning.
I’m not advocating for fake cheese pizza photos, fuck those sickos, but art can appear to be one thing on first glance and then take on a new meaning as we study and understand more.
Don't they often train the program with adult porn, and then the AI just puts a child's face onto bodies generated from this training data? I imagine these AI companies are scraping data from popular porn sites, or just paying for the data, and those porn sites work hard not to have CP on them. The result is a child's face on a body too mature for it. Remember that some actual adult actresses have body proportions that many would consider underdeveloped, and someone generating these pictures could regenerate until the AI uses those body proportions.
The point is that you don't need CP to train an AI to make CP. I am not justifying any moral position here, just pointing out a fact about AI technology.
People are getting way overexcited about AI at the moment. If a crime, or even a perceived crime, is remotely related to AI, it becomes the main focus.
Like the person who was hit by a self-driving car: the case was really about a hit-and-run driver who hit the pedestrian first, throwing them into the path of the self-driving car. Had the self-driving car not been there and a human driver been in its place, pretty much the same thing would have happened, but the coverage focused on the AI aspect.
If I used an AI to commit fraud, it was me who committed the fraud, not the AI, but you can be damn sure people would get hung up on that aspect of the case and not on the part where I committed a crime.
It's the same as when Ford introduced the Transit van (I have no idea what the US-market equivalent was). It was faster than most cars at the time, could carry heavier loads, and was physically larger. Inevitably it got used in a lot of bank robberies because the police literally couldn't keep up with it. People started talking about putting a performance limit on vehicles, when really the solution was that everyone else just needed better cars. If a performance limit had actually been implemented, it would have held us back.
What exactly is your point about the CSAM AI models in saying any of that?
Attempting to normalize and destigmatize representations of child sexual abuse by calling it art is extraordinarily evil.
Like Siesta by Arthur Berzinsh? It depicts child-like cherubs engaged in acts extremely close to eproctophilia with an adult woman.
Your first passage about criminalizing art is 100% correct and 100% irrelevant. You cannot call porn art. Porn with adults, children, dogs, pumpkins: all of that is made for people to get off, not to enjoy the emotions that real art provokes. Therefore we cannot compare criminalizing porn with criminalizing art.
There are edge cases, of course, where art might be provocative and considered immoral, maybe even illegal sometimes. But those would be edge cases, highly debated.
If you have AI pornography of children, regardless of there being no real victim- you’re a fucking pedo.
Period. End of argument.
Get help.
I agree with what you are saying.
However, I think psychologists might not be fans of giving them access to that material. I think the reason is that they would end up looking for more and more extreme material, and they could end up offending as a result.
Afaik we've yet to find out whether viewing AI-generated material makes an individual look for real-life child abuse imagery.
I believe viewing the latter allows many to keep real-life urges under control (I might re-check the literature on that), but it obviously comes with its own issues. If we can make AI-generated child pornography, and if it doesn't make people go looking for the "real stuff", we might actually make a very positive impact on child safety.
This is a bad take because it would generate a drive for larger databases to train against. This will not make the problem better.
According to the few studies we have from the nineties and aughts, most people who have sexual attractions to kids are aware that acting on them can be harmful and will turn to alternative ways to explore them (when they can't be suppressed or redirected). So yes, now that we have victimless ways to produce porn, the objections are to the fetishes themselves, not to any resulting violent behavior.
That said, people commonly and openly express their distaste for such people, more so than for domestic-violence offenders who assault their kids, just not sexually. Yet the same general disdain for child-sex-attracted adults does not translate into action to feed children or protect them from abusive households.
That said, when we've worried about fiction driving people to act it out in reality, history has proven this wrong every single time. Women are not driven to scandal and harlot behavior by trashy romance. Teens are not driven to violence by violent movies or video games. We can expect that porn featuring children is not going to compel someone to actually seek to sexually assault kids.