31 points

No reason not to ban them entirely.

The problem is enforcing the ban. Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files? It would be trivial to host a site in a country without legal protections and make the software available from anywhere.

21 points

Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files?

The problem with the former is that it would outlaw any self-hosted image generator. Any image generator is capable of being used for deepfake porn.

1 point

Perhaps an unpopular opinion, but I’d be fine with that. I have yet to see a benefit or possible benefit that outweighs the costs.

1 point

The problem is the cat’s out of the bag.

Open source image generators already exist and have been widely disseminated worldwide.

So all you’d end up doing is putting up a roadblock for legitimate uses. Anybody using it to cause harm will not be seriously impeded. They can just pick up the software from a Russian/Chinese/EU host or less official distribution methods.

It would be as effective as the US trying to outlaw the exporting of strong encryption standards in the 90s. That is to say, completely ineffective and actually harmful. Enemies of the US were still using strong encryption anyway.

6 points

Right, this is my point. The toothpaste is out of the tube. So would simply having the software capable of making deepfake porn be a crime?

11 points

I feel like the sensible, realistic course of action is for enforcement to target the act of sharing/distributing. Anything broader would be way too broad, since the tools that generate this stuff have unlimited purposes. Obvious child cases should be dealt with at the point of production, but otherwise the enforcement mechanism needs to be on the sharing/distribution part. The unfortunate analogy here is: blame the person, not the tool.

-6 points

Yeah, I feel like if you find this shit on someone’s computer, whether they shared it or not, there should be some consequences. Court-mandated counseling at a minimum.

6 points

Right. And honestly, this should already be covered under existing harassment laws.

0 points

Probably a good thing, but just banning something doesn’t do anything - you have to enforce it by getting rid of the software and then keep enforcing it.

1 point

I don’t know why you got downvoted; you’re right. This is going to be ridiculously hard to enforce.

0 points

Removed by mod
83 points

The laws regarding a lot of this stuff seem to ignore that people under 18 can and will be sexual.

If we allow people to use this tech for adults (which we really shouldn’t), then we have to accept that people will use the same tech on minors. It isn’t even necessarily pedophilia in all cases (such as when the person making them is also a minor)*, but it’s still something that very obviously shouldn’t be happening.

* we don’t need to get into semantics. I’m just saying it’s not abnormal (the way pedophilia is) for a 15-year-old to be attracted to another 15-year-old in a sexual way.

Without checks in place, this technology will INEVITABLY be used to undress children. If the images are stored anywhere, then these companies will be storing/possessing child pornography.

The only way I can see to counteract this would be to invade the privacy of users (and victims) to the point where nobody using them “legitimately” would want to use it…or to just ban them outright.

20 points

such as when the person making them is also a minor

I get the point you’re trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it’s still possession.

56 points

And that’s still a bit messed up. It’s a felony for a teen to have nude pictures of themselves; they’ll be registered sex offenders for life and probably be ineligible for most professions. That seems like quite a gross overreaction. There needs to be a lot of reform in this area, but no politician wants to look like a “friend” to pedophiles.

9 points

It does seem a bit heavy handed when the context is just two high schoolers tryna smash.

22 points

Which is more of a “zero-tolerance” policy, akin to giving a student defending themselves the same punishment as the person who initiated the attack.

10 points

I get the point you’re trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it’s still possession.

I agree, and on the one hand, I understand why it could be good to consider it illegal (to prevent child porn from existing), but it does also seem silly to treat it as a case of pedophilia.

1 point

Not just silly. Extremely damaging. We don’t even treat most other crimes minors commit this way. Records can often be expunged for other crimes. At the age of 18 they are generally sealed. But not in this case.

This is the government doing a bad job of regulating technology whose scope they do not fully understand, in an attempt to save the children by punishing them, sometimes for life, over what essentially amounts to heavy flirting between people of their own age group.

Child porn is not okay and it should be illegal. But the law cannot always be applied equally, because a kid sending another kid a nude of themselves is not the same as an adult using the nude of a child for sexual gratification or excitement. One of those things is natural and normal; the other is extremely reprehensible and damaging to the victims used to create those images.

-146 points

That’s a lot of words to defend fake child porn made out of photos and videos of actual children.

48 points

That’s about the right amount of words to completely ignore the sentiment of a statement so you can make a vapid holier-than-thou statement based on purported moral superiority.

4 points

What a dumb take. And I do those myself, so I know one when I see one.

1 point

I hope “those” refers to the dumb takes and not the nude photos of minors

17 points

That’s a lot of words to defend fake child porn made out of photos and videos of actual children.

Uh…this is the second sentence or so (and the start of the second paragraph, I think)

If we allow people to use this tech for adults (which we really shouldn’t)

So I’m not sure where you got the idea that I’m defending AI-generated child porn.

Unless you’re so adamant about AI porn generators existing that banning their usage on adults (or invading the privacy of the users and victims with oversight) is outright unthinkable? Lol

I’m saying that IF the technology exists, people will be using it on pictures of children. We need to keep that in mind when we think about laws for this stuff. It’s not just adults uploading pictures of themselves (perfectly fine) or adult celebrities (not fine, but probably more common than any truly acceptable usage).

31 points

Have you tried actually reading what they said instead of just making shit up?

12 points

But I want to be outraged now!

98 points

Reading comprehension not a strong suit? Sounds to me like they’re arguing for protections for both adults AND minors.

16 points

Words is treacherous bastards

11 points

For some reason I thought it was mainly to protect Taylor Swift, with teen girls being the afterthought.

3 points

Won’t somebody please think of Taylor?!

3 points

But not that way…

2 points

Removed by mod
40 points

This is probably not the best context, but I find it crazy how fast the government will get involved when it comes to lude content, while children are getting murdered in school shootings and gun control is just a bridge too far.

11 points

I think they act faster on those matters because, aside from it being a very serious problem, it also fits a conservative agenda.

It’s very easy to say: “LOOK, WE ARE DOING THIS TO PROTECT YOUR CHILDREN FROM PEDOPHILES!!!”

But they can’t just go and say “let’s enforce gun safety in schools,” because a conservative voter merely reading the words “gun safety” already goes badly for them.

They know they are sacrificing the well-being of children by not acting on school shootings, but for them that’s just the price of a few lives to stay in power.

5 points

Are quaaludes even still available in 2024?

Or did you mean to say “lewd”?

-5 points

If only their were context clues… oh wait you’re just being a jerk.

7 points

Did you mean “there were”?

1 point

Sorry can’t help it; I’m an energy vampire and we tend to be jerks. Got it from my dad.

2 points

There are no competing interests when it comes to protecting children from sexual exploitation. When it comes to protecting them from guns, there is the competing interest of the Second Amendment.

3 points

Removed by mod
2 points

Those shootings don’t happen in private schools.

Nudes happen in private schools.

