Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

105 points

I feel like I live on the Internet and I never see this shit. Either it doesn’t exist or I exist on a completely different plane of the net.

19 points

You ever somehow get invited to a party you’d usually never be at? With a crowd you never ever see? This is that.

26 points

You ever somehow get invited to a party you’d usually never be at? With a crowd you never ever see?

No?

25 points

You ever somehow get invited to a party

Also no 😥

11 points

Understandable, have a nice day

2 points

Why not just say “no, thank you” to such an invitation?

2 points

Well obviously you’d leave pretty immediately. A friend has never invited you somewhere new/unknown?

2 points

Because that’s how you meet new people?

14 points

On the Internet, censorship happens not through having too little information, but through having so much information that it becomes difficult to find what you want.

We all have only so much time to spend on the Internet, so we necessarily get a filtered experience of everything that happens on it.

1 point

Are you saying that not being exposed to things you don’t want is censorship?

1 point

No, that is not what I’m saying, mostly because I don’t think it is true. I’m saying that nowadays nearly every kind of information one can think of exists somewhere out there on the Internet; but if it only lives in relatively obscure places and you don’t know where to look for it, then it is still de facto censored by the sheer volume of other information out there.

71 points

I wonder if this winds up with revenge porn no longer being a thing? Like, if someone leaks nudes of me I can just say it’s a deepfake?

Probably a lot of pain for women from mouth breathers before we get there from here.

56 points

I mean, not much happened to protect women after The Fappening, and that happened to boatloads of famous women with lots of money, too.

Arguably, not any billionaires, so we’ll see I guess.

18 points

This has already been a thing in courts with people saying that audio of them was generated using AI. It’s here to stay, and almost nothing is going to be ‘real’ anymore unless you’ve seen it directly first-hand.

54 points

first-hand.

the first-hand in question:

4 points

If shitty AI-generated deformity porn of non-real people, with body parts like the ones in this image, isn’t a thing yet, I bet it will be. There, you’re all welcome.

2 points

Thereby furthering the erosion of our democracies and continuing the slide into Putin-style confusion.

1 point

We need trustworthy sources of news more than ever.

9 points

Why would it make revenge porn less of a thing? Why are so many people here convinced that as long as people say it’s “fake” it won’t negatively affect them?

The mouth breathers will never go away. They might even use the excuse the other way around: because someone could claim just about anything is fake, the images might be real and the victim might be lying. Remember that blurry pictures of Bigfoot were enough to fool a lot of people.

Hell, even if others believe it is fake, wouldn’t it still be humiliating?

9 points

I think you’re underestimating the potential effects of an entire society starting to distrust pictures and video. Yeah, a blurry Bigfoot fooled an entire generation, but nowadays most people you talk to will say it’s doctored. Scale that up to a point where literally anyone can make completely realistic pics/vids of anything in their imagination, and have it be indistinguishable from real life. I think there’s a pretty good chance that “nope, that’s a fake picture of me” will be a believable, no-questions-asked response to just about anything. It’s a problem.

0 points

There are still people who believe in Bigfoot and UFOs, and there are still people falling for hoaxes every day. To the extent that distrust is spreading, it doesn’t manifest as widespread reasonable skepticism but as a tendency to double down on what people already believe. There are more flat earthers today than there were decades ago.

We are heading toward a point where, if anyone says deepfake porn is fake, people might simply decide it’s real, regardless of reasons and arguments, just because they feel like it might be. At this point, this isn’t even a new situation. Just as people skip reputable scientific and journalistic sources in favor of random blogs that validate what they already believe, they will treat images, deepfaked or not, in much the same way.

So, at best, some people might believe the victim regardless, but some won’t no matter what is said, and they will treat the victim as if those images are real.

-2 points

I hope someone sends your mom a deepfake of you being dismembered with a rusty saw. I’m sure the horror will fade with time.

1 point

The default assumption will be that a video is fake. In the very near future you will be able to say “voice assistant thing, show me a video of that cute girl from the cafe today getting double teamed by Robocop and an Ewok wearing a tutu”. It will be so trivial to create this stuff that the question will be “why were you watching a naughty video of me” rather than “omg I can’t believe this naughty video of me exists”.

1 point

The mouth breathers will never go away. You’re the mouth breather.

8 points

Australia’s federal legislation making non-consensual sharing of intimate images an offense includes doctored or generated images because that’s still extremely harmful to the victim and their reputation.

7 points

Why do you think “there” is meaningfully different from “here”?

5 points

A deepfake is still humiliating.

71 points

I’ll wait for Taylor’s version.

2 points

I wasn’t expecting a joke this good. Thank you for that.

63 points

Fake celebrity porn has existed since before photography, in the form of drawings and impersonators. At this point, if you’re even somewhat young and good-looking (and sometimes even if you’re not), the fake porn should be expected as part of the price you pay for fame. It isn’t as though the sort of person who gets off on this cares whether the pictures are real or not—they just need them to be close enough that they can fool themselves.

Is it right? No, but it’s the way the world is, because humans suck.

49 points

Honestly, the way I look at it is that the real offense is publishing.

While it’s still creepy, it would be hard to condemn someone for making fakes for personal consumption. Making an AI fake is the high-tech equivalent of gluing a cutout of your crush’s face onto a Playboy centerfold. It’s hard to want to prohibit people from pretending.

But posting those fakes online is the high-tech, scaled-up version of xeroxing the Playboy centerfold with your crush’s face on it, and taping up copies all over town for everyone to see.

There’s obviously a line people should not cross, but it’s also clear that without laws to deter it, AI fakes are just going to circulate freely.

10 points

AI fake is the high-tech equivalent of gluing a cutout of your crush’s face onto a playboy centerfold.

At first I read that as “cousin’s face” and I was like “bru, that’s oddly specific.” Lol

2 points

Yup, it’s all the more frustrating when you take into account that social media sites do have the capability to detect whether an image is NSFW and whether it matches the face of a celebrity. Knowing Taylor’s fan base, the images probably get reported quickly.

It’s mainly Twitter as well, and it’s clear they are letting this go on to drum up controversy.
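To be concrete about what “the capability” above could even look like: here’s a rough, hypothetical sketch, not anything a platform has confirmed using. It pairs the open-source face_recognition library with a placeholder nsfw_score() function standing in for whatever proprietary NSFW classifier a site actually runs; the threshold and helper names are made up for illustration.

```python
# Hypothetical moderation sketch: flag an upload if it looks explicit AND a face
# in it matches a known celebrity. nsfw_score() is a placeholder assumption for
# whatever classifier a platform really uses; the face matching uses the
# open-source face_recognition library (pip install face_recognition).
import face_recognition


def nsfw_score(image_path: str) -> float:
    """Placeholder: return the probability in [0, 1] that the image is explicit."""
    raise NotImplementedError("plug in the platform's NSFW model here")


def should_flag(image_path: str, celebrity_encodings: list, nsfw_threshold: float = 0.8) -> bool:
    # Step 1: skip anything the classifier doesn't consider explicit.
    if nsfw_score(image_path) < nsfw_threshold:
        return False

    # Step 2: check every detected face against the known-celebrity encodings.
    image = face_recognition.load_image_file(image_path)
    for encoding in face_recognition.face_encodings(image):
        if any(face_recognition.compare_faces(celebrity_encodings, encoding)):
            return True

    return False
```

Nothing production-grade, obviously, but the building blocks for this kind of detection are already available off the shelf, which is exactly what makes the inaction frustrating.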

5 points

Humans are horrible, but a mainstream social media platform should not be a celebration of it. People need to demand change and then leave if ignored. I seem to hear people demanding change; the next step requires more impetus.

48 points

WHAT?? DISGUSTING! WHERE WOULD THESE JERKS PUT THIS? WHAT SPECIFIC WEBSITE DO I NEED TO BOYCOTT?

27 points

Google Search didn’t really turn up much, far less than if you were to search something like ‘Nancy Pelosi nude’ even. It kind of seems overblown, and the only reason it’s gotten any news is because of who it happened to. Being famous nowadays seems to mean photoshopped or deepfake porn of yourself will end up spread all over the internet.

1 point

Was this joke ever funny?

-20 points

Time and place. Don’t be a creep.

Yes I get the reference.

-2 points

Lol, seems like some people didn’t get it.

1 point

I didn’t get it.
