Happy or unhappy, I feel like body cam footage is too important a form of evidence to have reviewed by AI
Is this the kind of thing anyone could be happy about?
Cops reviewing themselves, we know how that works out.
- Cops being reviewed by shitty AI.
- ???
- ???
- wtf
Then again, when the police union doesn’t like something, makes me wonder what it’s exposing about them…
AI can’t be the last word in what gets marked for misconduct etc.; however, using it as a screening tool for potentially problematic moments in a law enforcement officer’s encounters would be useful. It’s an enormous task to screen through those hours upon hours of video, and probably prohibitively expensive for humans to work through.
You’d need to be certain the false negative rate is near zero, though, or important events could be missed, and with AI it’s nearly impossible to say that with certainty.
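A quick back-of-envelope illustration of why even a small false negative rate matters. The numbers here are made up for the sake of argument, not real statistics:

```python
# Illustrative only: these numbers are hypothetical, not real data.
incidents_per_year = 500     # suppose 500 flaggable moments citywide per year
false_negative_rate = 0.02   # a model that "only" misses 2% of them

missed = incidents_per_year * false_negative_rate
print(missed)  # 10.0 incidents silently dropped every year
```

Ten missed incidents a year from a 2% miss rate, and unlike a human reviewer, nobody would ever know which ten.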
don’t they always tell us
‘you have nothing to worry about if you have nothing to hide’
uh huh… how’s that feel now, government employee?
Yep.
Y’all like surveillance so much? Let’s put all government employees under a camera all the time. Of all the places I find cameras offensive, that one not so much.
I sure hope you get your daily dose of enjoying people’s misery watching the substitute teacher crying in the teacher’s lounge.
Cameras in a teacher’s lounge would be ridiculous but, in principle, cameras in classrooms make a lot of sense. Teachers are public officials who exercise power over others, and as such they need to be accountable for their actions. Cameras only seem mean because teachers are treated so badly in other ways.
If the police unions don’t like it, then it’s certainly going to be a positive step towards public safety.
The union will demand to train the model that scans the footage… in the future.
I am so confused by this, why does there need to be AI involved in this at all?
If somebody has a complaint, pull the footage; then the complainant goes over it and makes their case against the police officer. Why would an AI be necessary to find problems that nobody has complained about?
I feel like it’s a technology solution for what should be a “more transparency and a better system” solution. Make complaints easier and reduce the fear factor of making complaints.
The people most likely to be abused by police are the least likely to be able or willing to file a formal complaint.
So fix that. Don’t make an AI to dole out justice against police like some messed up lottery. This is such a hollow solution in my mind. AI struggles to identify a motorcycle, people expect it to identify abuse?
So fix that.
Were it so simple, it would have been fixed decades ago. The difference is that having AI review the footage is actually feasible.
I’m just theorising how AI could be used, but consider the situation where someone makes a complaint but doesn’t remember the exact time of the incident (say they only remember it was within a six-hour window, for this example), or what the officer looked like.
You have (for example) 20 officers on duty it could potentially be, over a six-hour window; that’s 120 hours, or 5 days, of footage. An AI can use facial recognition to find the complainant within minutes.
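The arithmetic above can be sketched out. This is just a back-of-envelope calculation; the frame sampling rate is my own assumption, not anything from the actual systems being discussed:

```python
# Back-of-envelope math for the scenario above: 20 officers on duty,
# a six-hour window, and a sampled frame rate for a face matcher.

def footage_hours(officers: int, window_hours: int) -> int:
    """Total hours of body cam footage to screen."""
    return officers * window_hours

def frames_to_scan(hours: int, sample_fps: float = 1.0) -> int:
    """Frames a face matcher must examine if we sample `sample_fps`
    frames per second instead of the full video stream (assumed rate)."""
    return int(hours * 3600 * sample_fps)

total = footage_hours(20, 6)
print(total, total / 24)       # 120 hours of footage, i.e. 5.0 days
print(frames_to_scan(total))   # 432000 frames to match at 1 fps
```

Even sampling just one frame per second, that’s hundreds of thousands of frames, which is trivial for a machine and hopeless for a human reviewer.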