
Why police must stop using facial recognition technologies

On January 9, 2020, Detroit Police Department (DPD) officers arrested me on my lawn in Farmington Hills, Michigan, in front of my wife and two young daughters, for a crime I had nothing to do with. They refused to tell me why, and I had to spend the night sleeping on a cold concrete bench in a crowded and filthy jail cell before eventually discovering that I had been falsely accused of stealing designer watches from a Detroit store.

While questioning me, two detectives blurted out that I had been arrested based on an incorrect facial recognition identification—a technology that has been shown to be both racist and flawed, especially when used in real-world conditions, such as with grainy security camera images.

This week, we finally reached a settlement in my wrongful arrest lawsuit against the City of Detroit, ensuring that what happened to me will never happen again.

Facial recognition technology has access to massive databases of millions of photos—including, at the time of my arrest, a database of 49 million photographs containing years of Michigan driver’s license photos. Anyone with a driver’s license can end up in these databases. The technology scans them all for similar faces and spits out a few potential suspects. The police will tell you that they use this information only as a “lead” and then conduct a serious investigation, but my own experience, and that of others wrongly arrested across the country, refutes that claim.

For example, the system returned the photo from my expired driver’s license as an “investigative lead” that might match the thief. Rather than investigating the accuracy of this supposed match, the police accepted the “lead” and put my photo in a lineup alongside five other photos of Black men, each included only because a computer algorithm had decided their photos looked enough like the robber’s to be considered similar. The witness—who hadn’t even seen the crime happen, but had simply viewed the security footage—picked my photo out of this tainted set of images. And it was on this evidence alone that the DPD relied to arrest me.


When I was finally released after 30 hours, I learned that my oldest daughter had lost her first tooth while I was in jail—a precious childhood memory now distorted by trauma for our entire family. She also turned a photo of our family face-down because she couldn’t stand to see my face after watching the police take me away. The girls even started playing cops and robbers and telling me I was the thief. There have been many moments over the past four years when I’ve had to try to explain to two little girls that a computer wrongly sent their father to jail.

The false charges were eventually dropped, but not before I had to defend myself in court against something I didn’t do. After the charges were dropped, I demanded that the officers apologize and urged them to stop using this dangerous technology. They ignored me.

Since my story went public in 2020, we have learned that two other Black people in Detroit, Porcha Woodruff and Michael Oliver, were also wrongly arrested for crimes they did not commit, due to errors the police made using facial recognition searches. Similar stories continue to surface across the country.

In a fairer world, police would no longer be allowed to use this technology at all. While this settlement can only go so far, the DPD’s use of this dangerous and racist technology will now be much more tightly controlled. Officers will not be able to conduct a photo lineup based solely on a facial recognition lead. Instead, they can make an identification after using facial recognition only if they first uncover independent evidence linking the person identified to the crime. In other words, the DPD can no longer substitute facial recognition for basic investigative police work.

Their obligations don’t stop there. Every time the DPD uses facial recognition in an investigation, it must inform the courts and prosecutors of any flaws or weaknesses in the search it conducted—such as poor photo quality, as in my case, where grainy security footage was used. The DPD will also, for the first time, have to train its officers on the limitations and inaccuracies of facial recognition technology, including how it misidentifies Black people at much higher rates than white people.

What the Detroit Police put me through changed my life forever. When I was taken to jail, I felt like I was trapped in a bad movie I couldn’t get out of. In the years since my wrongful arrest, my family and I have traveled across Michigan and the country urging policymakers to protect their constituents from the horror I experienced by stopping law enforcement’s misuse of this technology. I have explained many times that I do not want anyone else to live with the fear and trauma that facial recognition technology has inflicted on my family. With the settlement of my case, we are taking a big step toward that goal.