Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden on the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company had previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the accounts buying them, remained up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

167 points

Seen similar stuff on TikTok.

That’s the big problem with ad marketplaces and automation: the ads are rarely vetted by a human. You can just give them money and upload your ad, and they’ll happily display it. They rely entirely on users to report them, which most people don’t do because they’re ads, and they won’t take one down unless it’s really bad.

56 points

It’s especially bad on reels/shorts for pretty much all platforms. Tons of financial scams looking to steal personal info or worse. And I had one on a Facebook reel that was for boner pills that was legit a minute long ad of hardcore porn. Not just nudity but straight up uncensored fucking.

11 points

The user reports are reviewed by the same model that screened the ad up-front, so it does jack shit.

16 points

Actually, a good 99% of my reports end up in the video being taken down. Whether it’s because of mass reports or whether they actually review it is unclear.

What’s weird is the algorithm still seems to register that as engagement, so lately I’ve been reporting 20+ videos a day because it keeps showing them to me on my FYP. It’s wild.

16 points

That’s a clever way of getting people to work for them as moderators.

-18 points

Okay this is going to be one of the amazingly good uses of the newer multimodal AI, it’ll be able to watch every submission and categorize them with a lot more nuance than older classifier systems.

We’re probably only a couple years away from seeing a system like that in production in most social media companies.

19 points

Nice pipe dream, but the current fundamental model of AI is not and cannot be made deterministic. Until that fundamental change is developed, it isn’t possible.

13 points

> the current fundamental model of AI is not and cannot be made deterministic.

I have to constantly remind people about this very simple fact of AI modeling right now. Keep up the good work!

-3 points

What do you mean? AI absolutely can be made deterministic. Do you have a source to back up your claim?

You know what’s not deterministic? Human content reviewers.

Besides, determinism isn’t important for utility. Even if AI classified an ad wrong 5% of the time, it’d still massively clean up the spammy advertisers. But they’re far, FAR more accurate than that.

119 points

It’s all so incredibly gross. Using “AI” to undress someone you know is extremely fucked up. Please don’t do that.

51 points

I’m going to undress Nobody. And give them sexy tentacles.

8 points

Behold my meaty, majestic tentacles. This better not awaken anything in me…

9 points

Same vein as “you should not mentally undress the girl you fancy”. It’s just a support for that. Not that I have used it.

Don’t just upload someone else’s image without consent, though. That’s even illegal in most of Europe.

16 points

Why should you not mentally undress the girl you fancy (or anyone else, what difference does it make)? Where is the harm in it?

12 points

> Where is the harm of it

there is none, that’s their point


Would it be any different if you learn how to sketch or photoshop and do it yourself?

38 points

You say that as if photoshopping someone naked isn’t fucking creepy as well.

-2 points

Creepy, maybe, but tons of people have done it. As long as they don’t share it, no harm is done.


Creepy to you, sure. But let me add this:

Should it be illegal? No, and good luck enforcing that.

17 points

Yes, because the AI (if not local) will probably store the images on their servers.

7 points

good point


This is the only good answer.

17 points

This is also fucking creepy. Don’t do this.

3 points

And yet, we’ve (almost) all imagined someone we know naked.


I am not saying anyone should do it, and I don’t need some internet stranger to police me, thankyouverymuch.

2 points
Deleted by creator
-36 points

Can you articulate why, if it is for private consumption?

58 points

Consent.

You might be fine with having erotic materials made of your likeness, and maybe even of your partners, parents, and children. But shouldn’t they have the right not to be objectified as wank material?

I partly agree with you though, it’s interesting that making an image is so much more troubling than having a fantasy of them. My thinking is that it is external, real, and thus more permanent even if it wouldn’t be saved, lost, hacked, sold, used for defamation and/or just shared.

21 points

To add to this:

Imagine someone sneaking into your home and stealing your shoes, socks, and underwear just to get off on them, or to give them to someone who does.

Wouldn’t that feel wrong? Wouldn’t you feel violated? It’s the same with such AI porn tools. You serve to satisfy the sexual desires of someone else and you are given no choice. Whether you want it or not, you are becoming part of their act. Becoming an unwilling participant in such a way can feel similarly violating.

They are painting and using a picture of you, which is not as you would like to represent yourself. You don’t have control over this and thus, feel violated.

This reminds me of that fetish where one person basically acts like a submissive pet and gets treated like one by their “master”. They get aroused by doing that in public, one walking the other on a leash like a dog, on hands and knees. People around them become passive participants in that spectacle, and those people often feel violated. Becoming, unwilling and unasked, a participant, either active or passive, in someone else’s sexual act, with little or no control over it, feels wrong and violating to a lot of people.
In principle that even shares some similarities to rape.

There are countries where you can’t just take pictures of someone without asking them beforehand. Also there are certain rules on how such a picture can be used. Those countries acknowledge and protect the individual’s right to their image.

-2 points

> it is external, real, and thus more permanent

Though just like your thoughts, the AI is imagining the nude parts as well, because it doesn’t actually know what they look like. So it’s not actually a nude picture of the person. It’s that person’s face on an entirely fictional body.

29 points

An ex-friend of mine Photoshopped nudes of another friend. For private consumption. But then someone found that folder. And suddenly someone has to live with the thought that these nudes, created without their consent, were used as spank bank material. It’s pretty gross, and it ended the friendship between the two.

16 points

You can still be wank material with just your Facebook pictures.

Nobody can stop anybody from wanking on your images, AI or not.

Related Louis CK

17 points

If you have to ask, you’re already pretty skeevy.


And if you have to say that, you’re already sounding like some judgy jerk.

0 points

The fact that you do not even ask such questions shows that you are narrow-minded. That mentality leads to people thinking “homosexuality is bad” without ever asking why, and never having a chance of changing their mind.

16 points

It’s creepy and can lead to obsession, which can lead to actual harm for the individual.

I don’t think it should be illegal, but it is creepy and you shouldn’t do it. Also, sharing those AI images/videos could be illegal, depending on how they’re represented (e.g. it could constitute libel or fraud).

13 points

I disagree. I think it should be illegal. (And stay that way in countries where it’s already illegal.) For several reasons. For example, you should have control over what happens with your images. Also, it feels violating to become unwillingly and unasked part of the sexual act of someone else.

-7 points

Would you like it if someone made and wanked to pictures like these of your kids, wife, or parents? The fact that you have to ask says a lot about you, though.

9 points

There are plenty of things I might not like that aren’t illegal.

I’m interested in the thought experiment this has brought up, but I don’t want us to get caught in a reactionary fervor because of AI.

AI will make this easier to do, but people have been clipping magazines, and celebrities have had photoshopped fakes created, since both mediums existed. This isn’t new, but it is being commoditized.

My take is that these pictures shouldn’t be illegal to own or create, but they should be illegal to profit off of and distribute, meaning these tools specifically designed and marketed for it would be banned. If someone wants to tinker at home with their computer, you’ll never be able to ban that, and you’ll never be able to ban sexual fantasy.

4 points

The fact that people don’t realize how these things can be used for harm and weaponized is insane. It shows they are clearly not part of a vulnerable group, and it shows the privilege of never having dealt with it.

The future is amazing! Everyone with these apps going to the parks and making kids nude. Or bullying, which totally doesn’t happen in fucked-up ways with all the power of the internet already.

106 points

Yet another example of multi-billion-dollar companies that don’t curate their content because it’s too hard and expensive. Well, too bad: maybe you only profit $46 billion instead of $55 billion. Boo hoo.

68 points

It’s not that it’s too expensive, it’s that they don’t care. They won’t do the right thing until and unless they are forced to, or it affects their bottom line.

29 points

Wild that since the rise of the internet it’s like they decided advertising laws don’t apply anymore.

But Copyright though, it absolutely does, always and everywhere.

6 points

An economic entity cannot care; I don’t understand how people expect them to. They are not human.

5 points

Economic entities aren’t robots, they’re collections of people engaged in the act of production, marketing, and distribution. If this ad/product exists, it’s because people made it exist deliberately.

18 points

Your example is 9 billion difference. This would not cost 9 billion. It wouldn’t even cost 1 billion.

6 points

Yeah realistically you’re talking about a team of 10 to 30 people whose entire job is to give the final thumbs up or thumbs down to an ad.

You’re talking one to three million dollars a year, maybe throw an extra million on for the VP.

Chump change; they just don’t want to pay it cuz nobody’s forcing them to.

4 points

It would take more than 10-30 people to run a content review department for any of the major social media firms, but your point still stands that it wouldn’t be a billion annually. A few tens of millions between wages, benefits, equipment, and software, all combined, annually.

13 points

Shouldn’t AI be good at detecting and flagging ads like these?

8 points

“Shouldn’t AI be good” nah.

5 points

Build an AI that will flag immoral ads and potentially lose you revenue

Build an AI to say you’re using AI to moderate ads but it somehow misses the most profitable bad actors

Which do you think Meta is doing?

4 points

> Well too bad maybe you only profit 46 billion instead of 55 billion.

I can’t possibly imagine this quality of clickbait is bringing in $9B annually.

Maybe I’m wrong. But this feels like the sort of thing a business does when it’s trying to juice the same lemon for the fourth or fifth time.

11 points

It’s not that the clickbait is bringing in $9B, it’s that it would cost $9B to moderate it.

72 points

Interesting how we can “undress any girl” but I have not seen a tool to “undress any boy” yet. 😐

I don’t know what it says about people developing those tools. (I know, in fact)

58 points

Make one :P

Then I suspect you’ll find the answer is money. The ones for women simply just make more money.

1 point

This

31 points

I’ve seen a tool like that. Everyone was a bodybuilder and hung like a horse.

35 points

I’m going to guess all the ones of women have bolt on tiddies and no pubic hair.

9 points

Well of course. Sagging breasts are gross /s

32 points

That’s what we all look like.

Don’t check though

9 points

Gotta wonder where they get their horse dick training images from

7 points

notices ur instance

Can’t judge though, I have a Chance myself lawl

6 points

> yiffit.net

> horse dick joke

oh lawd

5 points

Where does Bad Dragon get them from?

19 points

Be the change you wish to see in the world

/s

18 points

You probably don’t need them. You can get these photos without even trying. It’s a bit of a problem, really.

9 points

You probably can with the same inpainting stable diffusion tools, it’s just not as heavily advertised.

8 points

Probably because the demand is not big or visible enough to make the development worth it, yet.

3 points

You don’t need to make fake nudes of guys - you can just ask. Hell, they’ll probably send you one even without asking.

In general women aren’t interested in that anyway and gay guys don’t need it because, again, you can just ask.

7 points

This is simultaneously a gross generalization and strangely accurate and I hate you for it

-6 points

What it says is that there isn’t any demand for seeing boys naked. We simply look worse because we fucked up evolution.

64 points

It remains fascinating to me how these apps are being responded to in society. I’d assume part of the point of seeing someone naked is to know what their bits look like, while these just extrapolate with averages (and likely, averages of glamor models). So we still don’t know what these people actually look like naked.

And yet, people are still scorned and offended as if they were.

Technology is breaking our society, albeit in a place where our culture was vulnerable to being broken.

101 points

> And yet, people are still scorned and offended as if they were.

I think you are missing the plot here… If a naked pic of yourself, your mother, your wife, your daughter is circulating around the campus, work or just online… Are you really going to be like “lol my nipples are lighter and they don’t know” ??

You may not get that job, promotion, entry into a program, etc. The harm done by naked pics in public would be just as real whether the representation is accurate or not… and that’s not even starting to talk about the violation of privacy and the overall creepiness of whatever people will do with that pic of your daughter out there.

42 points

I believe their point is that an employer logically shouldn’t care if some third party fabricates an image resembling you. We still have an issue with latent puritanism, and this needs to be addressed as we face the reality of more and more convincing fakes of images, audio, and video.

21 points

I agree… however, we live in the world we live in, where employers do discriminate as much as they can before getting in trouble with the law

20 points

This is a transitional period issue. In a couple of years you can just say AI made it even if it’s a real picture and everyone will believe you. Fake nudes are in no way a new thing anyway. I used to make dozens of these by request back in my edgy 4chan times 15 years ago.

19 points

Dead internet.

It also means in a few years, any terrible thing someone does will just be excused as a “deep fake” if you have the resources, and any terrible thing someone wants to pin on you will be cooked up in seconds. People won’t just blanket believe or disbelieve any possible deep fake. They’ll cherry-pick what to believe based on their preexisting world view and how confident the storytelling comes across.

As far as your old edits go, if they’re anything like the ones I saw, they were terrible and not believable at all.

19 points

> This is a transitional period issue. In a couple of years you can just say AI made it even if it’s a real picture and everyone will believe you.

Sure, but the question of whether they harm the victim is still real… if your prospective employer finds tons of pics of you with Nazi flags, guns, and drugs… they may just “play it safe” and pass on you… no matter how much you claim (or even the employer might think) they are fakes.

10 points

“Fools! Everybody at my school is laughing at me for having a 2-incher, but little do they know it actually curves to the LEFT!”

2 points

Lol

1 point

Mine curves to the left.

3 points

On the other end, I welcome the widespread creation of these of EVERYONE, so that it becomes impossible for them to be believable. No one should be refused from a job/promotion because of the existence of a real one IMO and this will give plausible deniability.

25 points

People are refused for jobs/promotions on the most arbitrary basis, often against existing laws but they are impossible to enforce.

Even if it is normalized, there is always the escalation factor… sure, nobody will hold Anita’s nudes against her, everyone has them and they are probably fake, right?.. but Perdita? Hmmm, I don’t want my law firm associated in any way with her pics of tentacle porn, that’s just too much!

Making sure we are all in the gutter is not really a good way to deal with this issue… especially since it will, once again, impact women 100x more than it will affect men.

4 points

Post your clothed full body pic

0 points

Yeah, let’s just sexually violate everyone. /s

Who the hell is upvoting this awful take? Please understand that it would never be equitable. If this became reality, it would be women and girls that were exploited the most viciously.

I guess if you don’t give a shit about people, especially women and girls, feeling safe in public at all, you would say something like this…

30 points

Something like this could be career ending for me. Because of the way people react. “Oh did you see Mrs. Bee on the internet?” Would have to change my name and move three towns over or something. That’s not even considering the emotional damage of having people download you. Knowledge that “you” are dehumanized in this way. It almost takes the concept of consent and throws it completely out the window. We all know people have lewd thoughts from time to time, but I think having a metric on that…it would be so twisted for the self-image of the victim. A marketplace for intrusive thoughts where anyone can be commodified. Not even celebrities, just average individuals trying to mind their own business.

12 points

Exactly. I’m not even shy; my boobs have been out plenty and I’ve sent nudes, all that. Hell, I met my wife with my tits out. But there’s a wild difference between pictures I made and released of my own will in certain contexts and situations vs pictures attempting to approximate my naked body, generated without my knowledge or permission because someone had a whim.

5 points

I think this is why it’s going to be interesting to see how we navigate this as a society. So far, we’ve done horribly. It’s been over a century now that we’ve acknowledged sexual harassment in the workplace is a problem that harms workers (and reduces productivity) and yet it remains an issue today (only now we know the human resources department will protect the corporate image and upper management by trying to silence the victims).

What deepfakes and generative AI do is make it easy for a campaign staffer, or an ambitious corporate ladder climber with a buddy with know-how, or even a determined grade-school student to create convincing media and publish it on the internet. As I note in the other response, if a teen’s sexts get reported to law enforcement, they’ll gladly turn it into a CSA production and distribution issue and charge the teens themselves with serious felonies carrying long prison sentences. Now imagine if some kid wanted to make a rival disappear. Heck, imagine the smart kid wanting to exact revenge on a social media bully, now equipped with the power of generative AI.

The thing is, the tech is out of the bag. As with mid-east princes who looked at cloned sheep (with their deteriorating genetic defects) and still sought to create a clone of themselves as an heir, humankind will use tech in the worst, most heinous possible ways until we find cause to cease doing so. (And no, judicial punishment doesn’t stop anyone.) So this is going to change society, whether we decide collectively that sexuality (even kinky sexuality) is not grounds to shame and scorn someone, or we use media scandals the way Italian monastics and Russian oligarchs use poisons, and scandalize each other like it’s the shootout at the O.K. Corral.

3 points

Thanks, I liked this reply. There is a lot of nuance here.

-1 points

I think you might be overreacting, and if you’re not, then it says much more about the society we are currently living in than this particular problem.

I’m not promoting AI fakes, just to be clear. That said, AI is just making fakes easier. If you were a teacher (for example) and you were so concerned that a student of yours could create an image that would cause you to pick up and move your life, I’m sad to say they could already do this, and have been able to for the last 10 years.

I’m not saying it’s good that a fake, or an unsubstantiated rumor of an affair, etc can have such big impacts on our life, but it is troubling that someone like yourself can fear for their livelihood over something so easy for anyone to produce. Are we so fragile? Should we not worry more about why our society is so prudish and ostracizing to basic human sexuality?

8 points

None of that is relevant. The issue being discussed here isn’t one of whether or not it’s currently possible to create fake nudes.

The original post being replied to indicated that, since AI, an artist, a photoshopper, whatever, is just creating an imaginary set of genitalia, and they have no ability to know if it’s accurate or not, there is no damage being done. That’s what people are arguing about.

5 points

The society we are living in may be handling things incorrectly, but this can absolutely have real-world damaging effects. As a collective we should worry about our society, but individuals absolutely are, and should be, justified in worrying about their lives being damaged by this.

19 points

Wtf are you even talking about? People should have the right to control whether they are “approximated” as nude. You can wax poetic about how it’s not necessarily accurate, but that’s because you are ignoring the woman who did not consent to the process. Like, if I posted a nude, then that’s on the internet forever. But now, any picture at all can be made nude and posted to the internet forever. You’re entirely removing consent from the equation, you ass.

4 points

I’m not arguing whether people should or should not have control over whether others can produce a nude (or lewd) likeness or perpetuate false scandal, only that this technology doesn’t change the equation. People have been accused of debauchery and scorned long before the invention of the camera, let alone digital editing.

Julia the Elder was portrayed (poorly, mind you) in sexual congress on Roman graffiti. Marie Antoinette was accused of a number of debauched sexual acts she didn’t fully comprehend. Marie Antoinette actually had an uninteresting sex life. It was accusations of The German Vice (id est lesbianism) that were the most believable and quickened her path to the guillotine.

The movie The Contender (2000) addresses the issue with happenstance evidence: a woman politician was caught on video in flagrante delicto at a frat party in her college years just as she was about to be appointed as a replacement Vice President.

Law enforcement still regards sexts between underage teens as child porn, and our legal system will gladly incarcerate those teens for the crime of expressing their intimacy to their lovers. (Maine, I believe, is the sole exception, having finally passed laws to let teens use picture messaging to court each other.) So when it comes to the intersection of human sexuality and technology, so far we suck at navigating it.

To be fair, when it comes to human sexuality at all, US society sucks at navigating it. We still don’t discuss consent in grade school. I can’t speak for anywhere else in the world, though I’ve not heard much good news.

The conversation about revenge porn (which has been made illegal without the consent of all participants in the US) appears to inform how society regards explicit content of private citizens. I can’t speak to paparazzi content. Law hasn’t quite caught up with Photoshop, let alone deepfakes and content made with generative AI systems.

But my point was, public life, whether in media, political, athletic or otherwise, is competitive and involves rivalries that get dirty. Again, if we, as a species actually had the capacity for reason, we would be able to choose our cause célèbre with rationality, and not judge someone because some teenager prompted a genAI platform to create a convincing scandalous video.

I think we should be above that, as a society, but we aren’t. My point was that I don’t fully understand the mechanism by which our society holds contempt for others due to circumstances outside their control, a social behavior I find more abhorrent than using tech to create a fictional image of someone in the buff for private use.

Sadly, fictitious explicit media can be as effective as a character assassination tool as the real thing. I think it should be otherwise. I think we should be better than that, but we’re not. I am, consequently frustrated and disappointed with my society and my species. And while I think we’re going to need to be more mature about it, I’ve opined this since high school in the 1980s and things have only gotten worse.

At the same time, it’s like the FGC-9: the tech cannot be contained any more than we can stop software piracy with DRM. Nor can we trust the community at large to use it responsibly. So yes, you can expect explicit media of colleagues to fly much the way accusations of child sexual assault flew in the 1990s in middle and upper management (often without evidence; it didn’t matter). And we may navigate it pretty much the same way, with the same high rate of career casualties.

4 points

Totally get your frustration, but people have been imagining, drawing, and photoshopping people naked since forever. To me the problem is if they try and pass it off as real. If someone can draw photorealistic pieces and drew someone naked, we wouldn’t have the same reaction, right?

6 points

I don’t think you are accounting for ease of use. It took time and skill for an individual to photoshop someone else. This is just an app. It takes more effort to prove the truth than it does to create a lie. Not to mention, as the other article explains, people are using this to bait children. :/

2 points

It takes years of practice to draw photorealism, and days if not weeks to draw a particular piece. Which is absolutely not the same as any jackass with a net connection and 5 minutes creating an equally or more realistic version.

It’s really upsetting that this argument keeps getting brought up. Because while guys are being philosophical about how it’s theoretically the same thing, women are experiencing real-world harm and harassment from these services. Women get fired for having nudes; girls are being blackmailed and bullied with this shit.

But since it’s theoretically always been possible somehow churning through any woman you find on Instagram isn’t an issue.

> Totally get your frustration

Do you? Since you aren’t threatened by this, yet another way for women to be harassed is just a fun little thought experiment.

-4 points

An artist doesn’t need your consent to paint/draw you. A photographer doesn’t need your consent if you’re in public. You likely posted your original picture in public (yay Facebook). Unfortunately, consent was never a concern here… and you likely gave it anyway.

8 points

Are you seriously saying that since I am walking in public, I am consenting to photos being taken of me and turned into nudes?

You’ve lost your damn mind.

13 points

The draw to these apps is that the user can exploit anyone they want. It’s not really about sex, it’s about power.

14 points

Human society is about power. It is because we can’t get past dominance hierarchy that our communities do nothing about schoolyard bullies, or workplace sexual harassment. It is why abstinence-only sex-ed has nothing positive to say to victims of sexual assault, once they make it clear that used goods are used goods.

Our culture agrees by consensus that seeing a woman naked, whether a candid shot, caught in flagrante delicto or rendered from whole cloth by a generative AI system, redefines her as a sexual object, reducing her qualifications as a worker, official or future partner. That’s a lot of power to give to some guy with X-ray Specs, and it speaks poorly of how society regards women, or human beings in general.

We disregard sex workers, too.

Violence sucks, but without the social consensus that propagates sexual victimhood, it would just be violence. Sexual violence is extra awful because the rest of society actively participates in making it extra awful.

permalink
report
parent
reply
-6 points

Dude I can imagine people naked in my head.

Yes, I think this AI trend is sad, and it says a lot about what kind of person uses these services. It also says a lot about what kind of company Meta is.

permalink
report
parent
reply
10 points

I suspect it’s more affecting for younger people, who don’t really think about the fact that in reality no one has actually seen them naked. It’s probably traumatizing for them, and logic doesn’t really apply in this situation.

permalink
report
parent
reply
23 points

Does it really matter though? “Well you see, they didn’t actually see you naked, it was just a photorealistic approximation of what you would look like naked”.

At that point I feel like the lines get very blurry, it’s still going to be embarrassing as hell, and them not being “real” nudes is not a big comfort when having to confront the fact that there are people masturbating to your “fake” nudes without your consent.

I think in a few years this won’t really be a problem because by then these things will be so widespread that no one will care, but right now the people being specifically targeted by this must not be feeling great.

permalink
report
parent
reply

It depends where you are in the world. In the Middle East, even a deepfake of you naked could get you killed if your family is insane.

permalink
report
parent
reply
7 points

It depends very much on the individual, apparently. I don’t have a huge data set, but there are girls I know who have had this happen to them, and some of them just laughed it off and really didn’t seem to care. But again, they were in their mid-twenties, not 18 or 19.

permalink
report
parent
reply
9 points
*

So we still don’t know what these people actually look like naked.

I think the offense is in the use of their facial likeness far more than their body.

If you took a naked super-sized barbie doll and plastered Taylor Swift’s face on it, then presented it to an audience for the purpose of jerking off, the argument “that’s not what Taylor’s tits look like!” wouldn’t save you.

Technology is breaking our society

Unregulated advertisement combined with a clickbait model for online marketing is fueling this deluge of creepy shit. This isn’t simply a “Computers Evil!” situation. It’s much more that a handful of bad actors are running Silicon Valley into the ground.

permalink
report
parent
reply
2 points

Not so much computers evil! as just acknowledging there will always be malicious actors who will find clever ways to use technology to cause harm. And yes, there’s a gathering of folk on 4Chan/b who nudify (denudify?) submitted pictures, usually of people they know, which, thanks to the process, puts them out on the internet. So this is already a problem.

Think of Murphy’s Law as it applies to product stress testing. Eventually, some customer is going to come in having broken the part you thought couldn’t be broken. Also, our vast capitalist society is fueled by people figuring out exploits in the system that haven’t been patched or criminalized (see the subprime mortgage crisis of 2008). So we have people actively looking to use technology in weird ways to monetize it. That folds neatly, like paired gears, into looking at how tech can cause harm.

As for people’s faces, one of the problems of facial recognition as a security tool (say, when used by law enforcement to track perps) is the high number of false positives. It turns out we look a whole lot like each other, though your doppelganger may be in another state and ten inches taller or shorter. In fact, an old (legal!) way of getting explicit shots of celebrities in the late 20th century was to find a look-alike and get them to pose for a song.

As for famous people, fake nudes have been a thing for a while, courtesy of Photoshop or some other digital photo-editing set combined with vast libraries of people. Deepfakes have been around since the late 2010s. So even if generative AI wasn’t there (which is still not great for video in motion) there are resources for fabricating content, either explicit or evidence of high crimes and misdemeanors.

This is why we are terrified of AI getting out of hand, not because our experts don’t know what they’re doing, but because the companies are very motivated to be the first to get it done, and that means making the kinds of mistakes that cause pipeline leakage on sacred Potawatomi tribal land.

permalink
report
parent
reply
2 points

This is why we are terrified of AI getting out of hand

I mean, I’m increasingly of the opinion that AI is smoke and mirrors. It doesn’t work, and it isn’t going to cause some kind of Great Replacement any more than a 1970s Automat could have eliminated the restaurant industry.

It’s less the computers themselves and more the fear surrounding them that seems to keep people in line.

permalink
report
parent
reply
9 points

Technology isn’t doing shit to society. Society is fucking itself like it always has.

permalink
report
parent
reply
2 points

No it is the entities doing it

permalink
report
parent
reply
1 point

You’re an entity.

permalink
report
parent
reply
8 points

Regardless of what one might think should happen or expect to happen, the actual psychological effect is harmful to the victim. It’s like walking up to someone and saying “I’m imagining you naked”: that’s still harassment and off-putting to the person, but the image apps have been shown to have much, much more severe effects.

It’s like the demonstration where they get someone to feel like a rubber hand is theirs, then hit it with a hammer. It’s still a negative sensation even if it’s not a strictly logical one.

permalink
report
parent
reply
-6 points

How dare that other person I don’t know and will never meet gain sexual stimulation!

permalink
report
parent
reply
13 points

My body is not inherently for your sexual stimulation. Downloading my picture does not give you the right to turn it into porn.

permalink
report
parent
reply
-1 points

Did you miss what this post is about? In this scenario it’s literally not your body.

permalink
report
parent
reply
-5 points

You get to tell me what I can and cannot think about in my own head?

permalink
report
parent
reply
-13 points

I think half the people who are offended don’t get this.

The other half think that it’s enough to cause hate.

Both arguments rely on enough people being stupid.

permalink
report
parent
reply
