One group that doesn’t love this technology is sex workers. The software can’t credibly distinguish between real escort ads and sex-trafficking-disguised-as-escort ads, meaning consenting adults often get swept up in its surveillance. In fact, a recent study funded by the Department of Justice found that police regularly mistake certain “red flags” in escort ads, like 24/7 availability or the use of specific emojis, for signs of trafficking.
That Thorn uses Amazon’s facial recognition tool is especially contentious. Research by MIT and the ACLU has shown that it misidentifies people of color at disproportionately high rates, and Amazon itself has banned police departments from using Rekognition, except in trafficking cases through software like Spotlight.
In 2011, a public-awareness campaign for DNA featured Donald Trump, Jamie Foxx, and the slogan “Real Men Don’t Buy Girls.”
Donald - “they just let you do it” - “beauty pageant dressing room” - Trump
🙄
Kutcher and his first wife, Demi Moore, founded an organization called DNA in 2009 after watching a Dateline special.
Combine this with Kutcher claiming not to have known about Masterson, plus Scientology’s numerous trafficking accusations, and it really illuminates how unserious they were about the program.
This was always just a PR move for Kutcher, so he doesn’t care if he is doing damage to legitimate sex workers. Much like FOSTA/SESTA, it’s real easy for anyone who wants to boost their image, lawmakers included, to play the hero by taking a “protect the children at all costs, zero tolerance” stance and handwaving away the harm their “helping” does.
That Thorn uses Amazon’s facial recognition tool is especially contentious. Research by MIT and the ACLU has shown that it misidentifies people of color at disproportionately high rates,
No, what the ACLU did was knowingly leave the default confidence threshold of 80% and then act “surprised” when they got exactly the false matches you’d expect at that setting.
They knew exactly what they were doing.
Amazon even responded to their claims, criticising the lack of proper configuration for such a complex system and pointing out that it recommends a 99% confidence threshold for law enforcement use. But the ACLU’s excuse was that 80% was the default setting, so they just used whatever it came with out of the box.
So all they managed to prove is that the default setting isn’t adequate for accurate identification, which says nothing about what the system can do when correctly configured.
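For anyone wondering what “the default setting” actually means in practice: Rekognition’s face-search call takes a match threshold, and if you don’t pass one it falls back to 80%. Here’s a minimal boto3 sketch of raising it to the 99% Amazon recommends for law enforcement; the region, collection, bucket, and image names are made up purely for illustration.

```python
import boto3

# Rekognition client (region is an assumption, purely for illustration)
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Search an existing face collection for matches to a probe image stored in S3.
# If FaceMatchThreshold is omitted it defaults to 80 -- the out-of-the-box
# setting the ACLU test ran with. Setting it to 99 only returns matches
# that Rekognition scores at 99% similarity or higher.
response = rekognition.search_faces_by_image(
    CollectionId="example-collection",  # hypothetical collection name
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "probe.jpg"}},
    FaceMatchThreshold=99.0,
    MaxFaces=5,
)

for match in response["FaceMatches"]:
    print(match["Face"]["FaceId"], round(match["Similarity"], 2))
```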
Edit: I see I’m being downvoted for stating facts.
Is there any evidence to suggest that law enforcement agencies have these complex systems properly configured? I’ve seen plenty of articles talking about minorities being arrested after some facial recognition software misidentified them. Claiming that the ACLU isn’t using the software properly doesn’t mean that anyone else is using it properly.
You’re talking about something else entirely. The ACLU’s argument is “these systems are so bad we can’t rely on them” and your argument is “law enforcement may not have them configured correctly”.
One of those is factually false.
That being said, every FR system is built differently, and each has its own advantages and considerations. But what I’ve seen in the news over the past few years has almost always been a policy and procedure failure. Somewhere between feeding in a photo of such low quality that it never should have been used, and the verifying officer looking at the source photo and failing to recognize that the current suspect is a different person, something broke down.
I’m actually astonished at how bad the average person is at comparing photos of people. Just look up the conspiracy nonsense these flat earthers go on about regarding the Challenger accident. They are convinced that each person who died is actually still alive and living under a new name. Then I looked at their “evidence” and couldn’t believe what I was seeing. Sure, these people are similar enough that they could fit a verbal description, but when you actually compare features it’s easy to see they’re all different people and can’t be the same.
I know it’s like that with some cops, because I know people in emergency services who have been taking FR courses. They told me that lots of departments (fire, police, 911 dispatch, forensics, etc.) are being trained on it, and not on the software, but on physically identifying people. With this tech and these false arrests, I guess it’s come to light that some people, cops or not, lack the fundamental ability to see minor but critical differences in facial anatomy.
Ultimately, whatever a computer system says, a person is making the final decision to arrest these people. This is where the issue lies.
Edit: I guess downvotes mean “I don’t like that you’re right” here also. I worked in the FR field for almost a decade. I’m familiar with the topic.
Ashton Kutcher.
Interesting article, though I would not expect a technology company to have all of the answers. The police are doing the actual work and need to be held accountable.
Oh for the love of god. When did we get to the point where EVERYTHING is a sinister plot?
Fucking lighten up.
Because it’s a self-interested millionaire with a sketchy past: nothing he does is altruism, it’s all PR and self-jerking.
Nothing ANYONE does is altruism anymore. Because no matter who they are, or what they do, someone will find something to bitch about or be offended by.
The app is being used by police to persecute sex workers and marks, not to find children who are victims of sex trafficking, because child sex traffickers are not stupid enough to give their “product” to anyone who would instantly put their faces on the internet for anyone to find.
Sex trafficking is a serious issue, but neither Kutcher nor his company has done their due diligence, and they are actually helping to persecute sex workers.
I’m not saying that it was intentional, but it does make the whole supposed effort seem like it’s more about optics and marketing.