A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.
Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.
The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.
Well, that’s an emotional response that includes no specifics or appeals to logic.
You clearly have no logic, so I’m not going to appeal to it. Just a general comment.
I am shocked to my core that you would write off an observation as having no logic. Shocked, I tell you!
I am perfectly cool with animated depictions of child sexual exploitation being in the same category as regular child exploitation regardless.
Not an observation. You’re saying you don’t care about actual victims, children actually being abused. Real, live victims. That’s not worse than someone drawing some pictures you don’t like?
Wow, you got me. That’s totally what I am saying, so much so that you had to fabricate a quote that represents the argument you want to have.
First of all, I didn’t say any of the things you’re inferring from the quote you posted.
But yes, I think sexually exploitative imagery of children is just as vile and disgusting as behavior that directly harms children, and it’s very indicative of someone who may attempt to harm a child in the future.