Nijeer Parks became the third person to come forward as a victim of mistakes made by facial recognition software widely used by police departments and other law enforcement agencies, according to his lawsuit and an interview with The New York Times.

Parks is suing the city of Woodbridge, New Jersey, after he spent 10 days in jail and more than $5,000 defending himself when police mistakenly arrested him based on a faulty match made by facial recognition software.

“I was locked up for no reason. I’ve seen it happen to other people. I’ve seen it on the news. I just never thought it would happen to me. It was a very scary ordeal,” Parks said. 

Earlier this year, two other Black men filed lawsuits alleging that they had been arrested and interrogated based on false matches made by facial recognition software.

Black researchers have been challenging companies like Amazon, IBM and Microsoft for years, repeatedly testing their systems and showing that all of them still struggle to identify Black faces and are broadly inaccurate.

In January 2019, Woodbridge Police officers were investigating a robbery at a Hampton Inn. They confronted the suspect and he gave them a fake ID from Tennessee before fleeing the scene in a rental car. 

Even though the officers knew the ID was fake, they still used it as a point of reference in their investigation. They sent the photo on the fake ID, which was not of the person they were looking for, to multiple state agencies that use some version of facial recognition software. 

Parks, who lives 30 miles away, was matched with the photo from the fake ID, according to NJ.com. He was nowhere near the crime, had an alibi, and said there was no way he could have been the culprit because he does not have a driver's license and cannot drive.

Despite the obviousness of the mistake, it nearly changed Parks' life forever.

Ten years ago, Parks was arrested twice for selling drugs and went to prison, getting out in 2016, The New York Times reported. If convicted in this latest case, he would have faced considerable prison time because it would have been his third conviction.

Once police made the facial recognition match, they arrested Parks and held him in Middlesex County Corrections Center for 10 days. 

Yet another criminal justice algorithm worked against Parks, this time New Jersey's no-bail pretrial system. The state uses an algorithm to score the "risk" of releasing a person before trial. Because of Parks' past convictions, he was not released before trial, forcing his family members to step in and hire a lawyer.
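New Jersey's actual tool is the Public Safety Assessment, a points-based instrument whose published factors include prior convictions and prior failures to appear in court. The sketch below is only a toy illustration of how a points-based risk score can turn a criminal record into a detention recommendation; every factor, weight, and cutoff here is hypothetical, not the real formula.

```python
# Toy illustration (NOT the actual Public Safety Assessment): a points-based
# pretrial risk score in which prior convictions alone can push a defendant
# over a detention threshold. All weights and cutoffs are hypothetical.
def pretrial_risk_score(prior_convictions: int,
                        prior_failures_to_appear: int,
                        pending_charge: bool) -> int:
    score = 0
    score += min(prior_convictions, 3) * 2   # capped so one factor can't grow unbounded
    score += min(prior_failures_to_appear, 2)
    score += 2 if pending_charge else 0
    return score

DETENTION_CUTOFF = 4  # hypothetical threshold for recommending detention

# Two prior convictions reach the cutoff by themselves, regardless of the
# strength of the evidence in the current case.
score = pretrial_risk_score(prior_convictions=2,
                            prior_failures_to_appear=0,
                            pending_charge=False)
print(score, score >= DETENTION_CUTOFF)  # -> 4 True
```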

In his lawsuit, Parks described the lackluster defense initially given to him by his public defender, who did not highlight Parks' repeated statements that he could easily get proof that he was at a Western Union in Haledon, New Jersey, at the time of the crime. 

Parks told The New York Times that he nearly took a plea deal out of fear that going to trial, even as an innocent person, would lead to a harsher sentence.  

He decided to fight the case in court and was eventually released; the charges were dropped months later.

In his lawsuit, Parks slams the police department for relying on the facial recognition software without checking whether the match was accurate before making an arrest.

"No reasonable officer would be able to consider that it was reasonable to issue an arrest warrant on the basis of facial recognition technology that is known to be faulty and untrustworthy," the lawsuit read. 

There continues to be sustained backlash against the use of facial recognition software despite its widespread adoption by law enforcement agencies, airports, and even schools, as Blavity previously reported.

After the protests over George Floyd's killing this summer, the major companies behind this facial recognition software, namely Amazon, IBM and Microsoft, all pledged to honor a one-year moratorium on its use by police departments.

But activists at the time noted that there was no way for anyone to be sure that police departments would no longer have access, and each company released vague statements on what the moratorium actually meant. 

MIT researchers Joy Buolamwini and Inioluwa Deborah Raji conducted a widely read study showing that Amazon's Rekognition software struggles to identify the faces of Black and brown people, particularly Black women.

One of the main problems they highlighted was the ability to set confidence thresholds in these systems, meaning an agency can choose to accept facial recognition matches the software is less than 90 percent, 85 percent, or even 75 percent confident in.

Amazon has told police departments that they should only use its facial recognition system at a 99 percent confidence threshold, yet police departments have said they do not do this, with most running the software at the 80 percent threshold it defaults to. Studies done by researchers have likewise used the 80 percent default as the benchmark.
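To make that concrete, here is a minimal sketch of what a match query looks like with Amazon Rekognition's CompareFaces API through the boto3 Python SDK. The file names are placeholders and the call requires AWS credentials; the point is that the threshold is a single tunable number that decides what counts as a "match."

```python
# Minimal sketch: comparing two face photos with Amazon Rekognition's
# CompareFaces API via boto3. File names are placeholders and AWS
# credentials are assumed. SimilarityThreshold filters out any candidate
# scored below it, and defaults to 80 if omitted.
import boto3

rekognition = boto3.client("rekognition")

with open("id_photo.jpg", "rb") as src, open("candidate.jpg", "rb") as tgt:
    response = rekognition.compare_faces(
        SourceImage={"Bytes": src.read()},
        TargetImage={"Bytes": tgt.read()},
        # Amazon's public guidance for law enforcement is a 99 percent
        # threshold; leaving this at the default 80 lets far weaker
        # matches through as "hits."
        SimilarityThreshold=99,
    )

for match in response["FaceMatches"]:
    print(f"Reported match at {match['Similarity']:.1f}% similarity")
```

Nothing in the API stops a department from lowering that number, which is why researchers benchmark the software at its 80 percent default.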

After the one-year moratorium was announced this summer, the ACLU released a scathing statement, demanding all of these companies stop selling these facial recognition programs for good.

"It is about time Amazon started recognizing the dangers face recognition surveillance poses to Black and brown communities and civil rights more broadly. Face recognition technology gives governments the unprecedented power to spy on us wherever we go. It fuels police abuse. These dangers will not just go away in a year," the ACLU wrote on Twitter in June.

"Amazon must commit to a full stop of its face recognition sales until the dangers can be fully addressed," the ACLU added. "It must also urge Congress and legislatures across the country to pause law enforcement use of the technology. We will continue the fight to defend our privacy rights from face surveillance technology. That means you're next, Microsoft."