I'm glad that the man 'charging toward someone with a knife' was apprehended with facial recognition technology. In general I favor its use for catching violent criminals.
'Austin police officers have received the results of at least 13 face searches from a neighboring police department since the city’s 2020 ban — and have appeared to get hits on some of them, according to documents obtained by The Post through public records requests and sources who shared them on the condition of anonymity.
“That’s him! Thank you very much,” one Austin police officer wrote in response to an array of photos sent to him by an officer in Leander, Tex., who ran a facial recognition search, documents show. The man displayed in the pictures, John Curry Jr., was later charged with aggravated assault for allegedly charging toward someone with a knife, and is currently in jail awaiting trial. Curry’s attorney declined to comment.'
> And in New Jersey, police wrongfully arrested Nijeer Parks in 2019 after face recognition technology incorrectly flagged him as a likely match to a shoplifting suspect. An officer who had seen the suspect (before he fled) viewed the face recognition result, and said he thought it matched his memory of the suspect’s face.
I bet this officer also sent the same email: "That’s him! Thank you very much," before arresting an innocent man.
You will get false positives with eyewitness identification as well. But if you only use one identification method you'd never know if another would disagree.
I'd like to emphasize that this is arguably the same method (facial similarity) being attempted twice, rather than two different and independent methods. While it's a marginal improvement, each attempt can go wrong for shared or similar reasons: facial-recognition AI, for example, can suffer from the same "all those minorities look alike" problem. The rough simulation after the list below illustrates the point.
Consider the contrast between:
1. "A human who saw the culprit's face believes your face matches, and a computer looking at video-footage agreed."
2. "A human who saw the culprit's face believes yours matches, and your car is the same color and model as the car the culprit drove off in."
False positives are the main problem here, exactly. We'd all feel a lot better about facial recognition technology with police reform that ended/compensated the widespread harm being done to mere suspects, as well as fixing courtroom standards of evidence to take into account that things like facial recognition are essentially pulling signal out of noise.

If facial recognition turns police onto a suspect who is then found to possess the stolen items, the latter is corroborating evidence that they've found the right person. If facial recognition turns police onto a suspect who then merely looks like what eyewitnesses remember, that part of the witness's recollection is inherently correlated with how the suspect was picked, and so it needs to be disregarded.
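As a hand-wavy Bayes sketch of that last point (all priors and likelihoods below are invented for illustration, not real statistics about facial recognition or eyewitnesses): evidence that is correlated with how the suspect was selected barely moves the posterior, while genuinely independent evidence, like turning up the stolen items, moves it a lot.

```python
# Toy Bayesian update illustrating independent corroboration vs. correlated
# evidence. Every number is an assumption chosen for illustration only.

def posterior(prior, p_evidence_if_guilty, p_evidence_if_innocent):
    """P(guilty | evidence) via Bayes' rule."""
    num = prior * p_evidence_if_guilty
    return num / (num + (1 - prior) * p_evidence_if_innocent)

prior = 0.01  # assumed prior that someone flagged out of a large pool is the culprit

# Facial-recognition hit: likely if guilty, but lookalikes also trigger it.
p_after_fr = posterior(prior, 0.95, 0.05)

# Independent corroboration: the suspect is found with the stolen items.
p_with_stolen_items = posterior(p_after_fr, 0.80, 0.001)

# Correlated "corroboration": an eyewitness says the suspect looks like the
# culprit. The suspect was picked *because* they look like the culprit, so an
# innocent lookalike is almost as likely to pass this check as the culprit.
p_with_lookalike_id = posterior(p_after_fr, 0.90, 0.70)

print(f"after facial-recognition hit:          {p_after_fr:.3f}")
print(f"+ stolen items (independent evidence): {p_with_stolen_items:.3f}")
print(f"+ eyewitness lookalike (correlated):   {p_with_lookalike_id:.3f}")
```

With these invented numbers the lookalike ID nudges the posterior from roughly 0.16 to roughly 0.20, while the stolen items push it past 0.99, which is the sense in which the correlated part of the recollection should carry little or no extra weight.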
Wait, having a computer say that your face matches a video does not prove that you are the person in the video. Sure, the police can talk to you, but that alone is not sufficient evidence.
Okay, in that case, the victim (as sole witness) said that she recognized the defendant as the perpetrator. I am not a lawyer, but I'm not sure you can go to jail nowadays just because a victim (and sole witness) recognizes you? That seems like rather slim evidence, and it could be weaponized.
If I have to choose between a state where that dude with the knife is still at large, and a state where people are too afraid to dissent because they're worried about their attendance at a protest showing up on their public record, I'll take the dude with the knife.
Yeah, having police say "we could find out who the guy charging toward someone with a knife is, and we have the technology, but we can't do it" is not likely to be a winning political argument.
I am also in favor of catching violent criminals. And as such I join you in the downvote fest. Of course, no one puts forth a cogent argument about why catching violent criminals with this technology is bad. Can it be abused? Yes. Was it abused here? No. Luckily, we live in a democracy where we make the rules. If we want a policy that law enforcement can't buy these systems, make that the law. If we want a policy where law enforcement can't use these systems in any capacity, then make that the law.