Police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application that is presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

252 points

This was just a matter of time - and there isn’t really much those affected can do (or, in some cases, should do). Shutting down that service is the right call - but that’ll only buy a short amount of time: training custom models is trivial nowadays, and both the skill and hardware to do so are within reach of the age group in question.

So in the long term we’ll see this shift to images generated at home, by kids often too young to be prosecuted - and you won’t be able to stop that unless you start outlawing most AI image-generation tools.

At least in Germany, the laws dealing with child/youth pornography were badly botched by incompetent populists in the government - laws that would send any of those parents to jail for at least a year if they take possession of one of those generated pictures. Having one sent to their phone and going to the police to file a complaint would be enough to get a prosecution against them started.

There’s one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying “they’re AI generated” is becoming a plausible way out.

127 points

There’s one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying “they’re AI generated” is becoming a plausible way out.

Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.

Ironically, in a sense we will revert to the era before photography existed. To verify whether something is real, we might have to rely on witness testimony.

57 points

Politics is about to get WILD

19 points
*

Dwayne Elizondo Mountain Dew Herbert Camacho approves!

Shit’s going to get real emotional

37 points

To verify if something is real, we might have to rely on witness testimony.

This is not going to work. Just because images and videos become less reliable doesn’t mean we will forget that eyewitness testimony is very unreliable.

26 points

You say “forget” like it’s not still incredibly common as evidence.

There’s lots of data showing that eyewitnesses aren’t reliable, but that doesn’t mean courts have actually stopped relying on them. AI making another form of evidence untrustworthy will just result in eyewitnesses taking its place.

30 points

Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.

This just isn’t true. They will still be used to sexualise people, mostly girls and women, against their consent. It’s no different from AI-generated child pornography. It does harm even if no ‘real’ people appear in the images.

Fucking horrible world we’re forced to live in. Where’s the fucking exit?

13 points

It is different than AI-generated CSAM because real people are actually being harmed by these deepfake images.

0 points

Sauce that allowing computer-generated CP causes more harm?

18 points

A bit off topic, but I wonder if the entertainment industry as a whole is going to be completely destroyed by AI when it gets good enough.

I can totally see myself prompting “a movie about love in the style of Star Wars, with Ryan Gosling and Audrey Hepburn as the leads, directed by Alfred Hitchcock, written by Victor Hugo.” And then what? It’s game over for any content creation.

Curious if I’ll see that kind of power at home (using open source tools) in my lifetime.

9 points

I envisage a world where you’re browsing Netflix and, based on past preferences, some of the title cards are generated on the fly for you. Then, based on what you click, the AI engine warms up and generates the film for you in real time. Essentially indistinguishable from the majority of Hollywood regurgitation.

And because the script is just a series of autogenerated prompts, it’s like a choose-your-own-adventure book: you can steer the narrative the way you want if you choose to. Otherwise it’ll be good enough to keep most monkey brains happy, and you won’t even be able to tell the difference most of the time.

6 points

I know it’s impossible to perfectly predict future technology, but I believe AI will exist alongside traditional filmmaking. You’ll NEVER get something with the emotional impact of Up or Schindler’s List from an AI. You’ll be able to make fun action or fantasy movies though, and like you said, fully customized for the viewer. I imagine it’ll be like CGI vs traditional animation now - you only see the latter for passion projects, but for most uses, CGI works well enough.

2 points

This is already starting to happen for digital illustration. With better models and enough images saved, you can already train a model to replicate the art created by an artist.

10 points

Holy shit, I never thought of the whole witness testimony aspect. For some reason my mind was just like “well, nothing we see in videos or pictures is real anymore, guess everyone is just gonna devolve into believing whatever confirms their bias and argue endlessly about which pictures are fake and which are real.”

Witness testimony and live political interactions are going to become incredibly important for how our society views “the truth” in world events in the near future. I don’t know if I love or hate that.

6 points

Not necessarily - solutions can be implemented. For example, footage from private security cameras can be sent in real time to a trusted establishment (trusted by the court, at least), where it can be timestamped and stored (perhaps not even stored there; encryption with a timestamp may be enough). If the source camera and the network are secure, the footage is secure too.
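The scheme described above can be sketched in a few lines. This is a minimal illustration, not a complete protocol: the trusted party only ever needs a small hash-plus-timestamp receipt, never the footage itself, and any later copy of the footage can be checked against that receipt. Function names here are made up for the example.

```python
import hashlib
import time

def make_footage_receipt(footage: bytes) -> dict:
    """Hash a chunk of camera footage and pair it with a timestamp.

    The trusted establishment stores (or countersigns) only this small
    receipt; the full footage stays with the camera owner, yet any
    later copy can be verified against the recorded hash.
    """
    digest = hashlib.sha256(footage).hexdigest()
    return {"sha256": digest, "recorded_at": int(time.time())}

def footage_matches(footage: bytes, receipt: dict) -> bool:
    """Check that a piece of footage matches a stored receipt."""
    return hashlib.sha256(footage).hexdigest() == receipt["sha256"]
```

In practice the receipt would be signed by the trusted party so the timestamp itself can't be backdated, but the hash comparison is the core of the idea.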

9 points
*
Deleted by creator
3 points

Network security is a pretty big ask, though - just look at how many unsecured cameras are around now. And once an attacker is in, anything generated on that network becomes suspect: how do you know the security camera feed wasn’t intercepted, manipulated, or replaced altogether?

2 points

To verify if something is real, we might have to rely on witness testimony flagrancy.

FTFY. Witness testimony has never been that good a means of verifying that something is real.

2 points

Maybe there will be cameras as well that sign the pictures they take?
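Something like this is in fact being pursued by content-provenance efforts such as C2PA. A bare-bones sketch of the idea, with an important simplification: the stand-in below uses an HMAC with a shared secret, whereas a real signing camera would hold an asymmetric private key in hardware so that anyone can verify with the public key without knowing the secret. The key and function names are illustrative only.

```python
import hashlib
import hmac

# Simplified stand-in: a per-camera secret. A real camera would use an
# asymmetric keypair (private key sealed in hardware, public key used
# by verifiers), but the principle is the same: the signature binds
# the exact image bytes to the device that captured them.
CAMERA_KEY = b"per-camera-secret-sealed-in-hardware"

def sign_image(image: bytes) -> str:
    """Produce a tag over the raw image bytes at capture time."""
    return hmac.new(CAMERA_KEY, image, hashlib.sha256).hexdigest()

def verify_image(image: bytes, signature: str) -> bool:
    """Recompute the tag; compare_digest avoids timing side channels."""
    return hmac.compare_digest(sign_image(image), signature)
```

Any edit to the image bytes, however small, invalidates the signature, which is exactly the property that would let a court distinguish camera output from generated or doctored images.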

-21 points

That’s why we need Blockchain Technology…

Check Blockchain Camera for example: https://github.com/sv1sjp/Blockchain_Camera

Abstract:


Blockchain Camera provides an easy and safe way to capture and guarantee the existence of videos, reducing the impact of modified videos, as it can preserve the integrity and validity of videos using Blockchain Technology. Blockchain Camera sends to the Ethereum Network the hash of each video and the time the video was recorded, in order to be able to validate that a video is genuine and hasn’t been modified, using a Blockchain Camera Validation Tool.
16 points

How exactly does that prevent someone from uploading a fake video?

7 points
*

How is that better than an immutable database where you guarantee trust simply by getting your own public hash receipt for the database every time you introduce a new item? Why obfuscate things by riding the “Blockchain” hype bandwagon?
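The alternative being described here is essentially an append-only hash chain: each entry’s hash covers the previous hash, so publishing just the latest hash (“receipt”) commits to the entire history, no blockchain required. A minimal sketch, with illustrative names:

```python
import hashlib

class HashChainLog:
    """Append-only log: each entry's hash chains over the previous
    head, so the latest receipt commits to the whole history."""

    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []
        self.head = self.GENESIS

    def append(self, item: bytes) -> str:
        """Add an item; return the new head as a public receipt."""
        self.head = hashlib.sha256(self.head.encode() + item).hexdigest()
        self.entries.append(item)
        return self.head

    def verify(self, receipts) -> bool:
        """Recompute the chain and check it against saved receipts.
        Any tampered or reordered entry breaks every later receipt."""
        head = self.GENESIS
        for item, receipt in zip(self.entries, receipts):
            head = hashlib.sha256(head.encode() + item).hexdigest()
            if head != receipt:
                return False
        return True
```

As long as the operator publishes each receipt somewhere they can’t retract it (a newspaper ad would do), readers get the same tamper-evidence the blockchain pitch promises.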

18 points

Same goes for any deepfake. People are losing their shit because “we won’t know what’s real and what’s not!”

We should have been teaching critical thinking a generation ago. Sagan was pleading for reform in the 90s. We can start teaching the next generation how to navigate the Information Age. What we can’t do is make the world childproof.

10 points

Yeah, what I see happening is people end up not caring as much because there’s going to be so much plausible AI generated crap that any real stuff will be lost in the noise.

3 points

Source for the law you’re referring to, please. I want to read it in detail.

8 points

Start with this relatively recent case, and from there you should have enough info to search for yourself what has happened over the last few years - it is exactly what people warned about back then. But anyone who confronts the hysterical lunatics, who want to club everything remotely connected to “teenagers discovering sexuality” with criminal law, with well-reasoned arguments is of course immediately branded a paedophile themselves.

https://www.swr.de/swraktuell/rheinland-pfalz/koblenz/lehrerin-kinderpornografischer-inhalte-konfisziert-deswegen-angeklagt-100.html

113 points

At least now you can claim it’s AI if your real nudes leak

81 points

In the long term that might even lead to society stopping their freak-outs every time someone in some semi-sensitive position is discovered to have nude pictures online.

27 points

I hope so. We shouldn’t be ashamed of our bodies or sexuality.

64 points
*

Interesting. Replika AI, ChatGPT, etc. crack down on me for writing erotic stories and role-play dialogues, and this Clothoff app happily draws child pornography of 14-year-olds? Shaking my head…

I wonder why they have no address or anything on their website, and why the app isn’t available in any of the proper app stores.

Obviously the police should ask Instagram who is blackmailing all these girls… Teach them a proper lesson. And then stop this company. Fine them a few million for generating and spreading synthetic CP. At least write a letter to their hosting or payment providers.

4 points

Fined? Fuck that. CP must result in jail time.

2 points
*

I just hope they even try to catch these people. I’ve tried to look up who’s behind it: the domain is registered with name.com and the server is behind Cloudflare. I’m not Anonymous, so that’s the point at which I’m at my wits’ end. Someone enraged could file a few reports with their abuse contacts… Just sayin’…

There’s always the possibility they just catch the boy and punish only him, letting the even more disgusting people in the background keep doing what they want, because getting hold of them would be difficult. That would be the easiest route for the prosecutors and the least effective way to deal with this issue as a whole.

-2 points

Prison at the very least and all the inmates need to know that you engaged in CP.

2 points

I thought some kids did this?

1 point
*

I didn’t follow how the story turned out that closely. I think it was a schoolmate who did this. I kind of split up my answer because I think if a kid/minor is the offender, it’s not yet too late for them to learn how to behave (hopefully). But blackmailing people with nudes is a bit more than the usual bullying and occasional fight between boys we had back in the day. I trust some judge will look at the individual case and come up with a proper punishment that factors this in.

What annoys me is the people who offer this service, advertise use cases like this, and probably deliberately didn’t put any filters in place - not even for pictures of minors. I think they should be charged, fined, and ultimately that business model should be banned. I (anonymously) filed a complaint after writing that comment in September. But they’re still online as of today.

So in my opinion the kid should be taught a lesson and the company should pay for this and be closed for good.

40 points

Yes, let’s name the tool in the article so everybody can participate in the abuse.

30 points

I doubt not naming it would do much of anything.

11 points

Considering that AI services typically cost money, especially those advertising adult themes, naming it does kinda support the hosts of such services.

11 points

Then again, naming and shaming puts pressure on them too. But in the end I doubt it matters. Those who want to use them will find them.

6 points

You can literally Google ‘AI nude generation tool’ and get multiple results already. And I do sort of agree with you as I’m not sure how naming this specific tool was necessary or beneficial here. But I don’t think not naming it is going to prevent anyone interested in such a tool from finding one. The software/tool itself is (currently) not illegal.

38 points

The shock value of a nude picture will become increasingly humdrum as they become more widespread. Nudes will become so common that no one will bat an eye. In fact, some less endowed, less perfect ladies will no doubt make AI-generated pictures or movies of themselves to sell on the internet. Think of it as Photoshop x 10.

51 points

This isn’t about nude photos, it’s about consent.

52 points

I can already get a canvas and brush and draw what I think u/DessertStorms looks like naked and there is nothing you can do about it.

-5 points

The lack of empathy in your response is telling. People do not care for the effect this has on teenage girls. They don’t even try to be compassionate. I think this will just become the next thing girls and women will simply have to accept as part of their life and the sexism and objectification that is targeted at them. But “boys will be boys” right?

-13 points

You’re not making the point you think you are; instead you’re just outing yourself as a creep. ¯\_(ツ)_/¯

34 points

Photoshopped nude pictures of celebrities (and people the photoshopper knew personally) have been around for at least 30 years at this point. This is not a new issue as far as the legal situation is concerned, just the ease of doing it changed a bit.

-24 points

Have you ever posted a photo on Facebook or Instagram?

If the answer is yes, congratulations! You gave consent.

8 points

Please show me where exactly the terms and conditions mention the production and publication of AI-generated nudes on those sites.

Also eww, I would not want to be near you in real life.

2 points
*

Jesus y’all really eat stones for breakfast everyday around these parts

7 points

The article is about children.

2 points

The age of the victims is not really relevant. The problem would remain if the article were about adults.

3 points

The problem is very different here because they are children.

2 points

People already do this on dating apps with filters


Europe

!europe@feddit.de
