Amazon's Facial Recognition Software Falsely Identified 27 Athletes As Criminals, Study Reveals
Stars like New England Patriot Duron Harmon said the technology was a danger to people of color.
October 28, 2019 at 11:16 pm
As part of its Press Pause on Face Surveillance campaign, the ACLU of Massachusetts released an alarming study of Amazon's facial recognition software, Rekognition, which reportedly falsely matched the faces of 27 New England Patriots players with those found in a criminal database.
The ACLU has joined legislators in waging a years-long battle against Amazon over its facial recognition software, which the company has already sold to dozens of police and security organizations around the world. For years, Black researchers and scientists have said the software struggles to identify darker-skinned women and men.
“This technology is flawed,” said New England Patriots safety Duron Harmon, one of the 27 faces misidentified by Rekognition.
“If it misidentified me, my teammates and other professional athletes in an experiment, imagine the real-life impact of false matches. This technology should not be used by the government without protections. Massachusetts should press pause on face surveillance technology,” Harmon told the ACLU.
The ACLU of Massachusetts launched the campaign in June after two years of flippant responses from Amazon, which continues to brush off complaints about bias in its system. Dozens of companies, including IBM, Microsoft and Face++, are working on facial recognition software, but many have been more open to critique, working with researchers to tweak their technology.
Both IBM and Microsoft published blog posts and studies on their facial recognition software, working with outside researchers to improve the percentage of correct matches.
However, Amazon has refused to budge, claiming repeatedly over the years that researchers are intentionally misusing its software to make it appear biased against people with darker skin. The company has complained that researchers left Rekognition at its default 80% similarity threshold, which it says explains the large number of false matches.
Amazon says it urges police departments to use a "99% match" setting, but the company admitted to The New York Times earlier this year that it has little control over how its software is actually used.
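The dispute comes down to a single tunable parameter: face-search APIs return candidate matches with similarity scores, and the caller decides the cutoff below which candidates are discarded (in Amazon's own SDK this is the `SimilarityThreshold` parameter of the CompareFaces operation). A minimal sketch of how that cutoff changes the outcome, using hypothetical subject IDs and scores rather than real API output:

```python
# Hypothetical candidate matches as a face-search call might return them:
# each pairs a database entry with a similarity score from 0 to 100.
candidate_matches = [
    {"subject_id": "mugshot_0413", "similarity": 99.2},
    {"subject_id": "mugshot_1187", "similarity": 91.5},
    {"subject_id": "mugshot_2054", "similarity": 84.0},
    {"subject_id": "mugshot_0071", "similarity": 80.3},
]

def filter_matches(matches, threshold):
    """Keep only candidates at or above the similarity threshold."""
    return [m for m in matches if m["similarity"] >= threshold]

# At the 80% default the researchers used, all four candidates count as "matches".
print(len(filter_matches(candidate_matches, 80)))  # prints 4
# At the 99% setting Amazon recommends for law enforcement, only one survives.
print(len(filter_matches(candidate_matches, 99)))  # prints 1
```

The same probe photo can therefore produce several "criminal matches" or none at all depending on a number the operator chooses, which is why both sides point to the threshold when arguing over the ACLU's results.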
"The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines," an Amazon spokesperson said in a statement to Business Insider.
"When used with the recommended 99% confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking," the spokesperson added.
Many legislatures across the country, including Massachusetts's, are debating laws that would bar police departments from using the software, fearing it could unleash a wave of false arrests based on faulty matches. Since the technology was introduced in cities around the world, activists, researchers and legislators have complained about the large number of false positives and the potential for misuse.
Joy Buolamwini, founder of the Algorithmic Justice League and a researcher at the M.I.T. Media Lab, made waves over the last two years as a staunch opponent of Amazon's software. She wrote Amazon CEO Jeff Bezos a letter last year imploring him to pump the brakes on selling the technology.
"As the leader of the Algorithmic Justice League, and a concerned citizen, I join over 150,000 individuals, nearly 70 organizations, your 19 concerned shareholders and Amazon employees who have raised their voices in urging you to stop equipping law enforcement with facial analysis technology," Buolamwini wrote.
"Even if the Amazon Rekognition services and products you are selling to police departments were completely flawless, the potential for abuse on historically marginalized communities would not be reduced," she said.
Her January research with the M.I.T. Media Lab found that Rekognition could accurately identify lighter-skinned faces but misclassified women as men 19% of the time; more than 30% of darker-skinned women were mistaken for men.
Buolamwini made a YouTube video explaining how many of the companies working on facial recognition software were having trouble with women of color, particularly dark-skinned Black women, either misidentifying them as other people or misclassifying their gender. Even women like Michelle Obama were misidentified by the systems.
Amazon has trotted out Dr. Matt Wood, general manager of artificial intelligence, to attack Buolamwini and dispute her findings. He has written two lengthy blog posts in response to Buolamwini's work, slamming her for misunderstanding the software and for implying the company was unaware of the risks associated with any facial recognition software.
"The research paper in question does not use the recommended facial recognition capabilities, does not share the confidence levels used in their research, and we have not been able to reproduce the results of the study," Wood wrote in January in response to Buolamwini's work.
"The answer to anxieties over new technology is not to run ‘tests’ inconsistent with how the service is designed to be used, and to amplify the test’s false and misleading conclusions through the news media," he added.
For its study, the ACLU of Massachusetts took photos of 188 athletes from the Boston Bruins, Boston Celtics, Boston Red Sox and New England Patriots and ran them against a database of 20,000 public arrest photos. The ACLU said almost one in six athletes was falsely identified.
“The results of this scan add to the mounting evidence that unregulated face surveillance technology in the hands of government agencies is a serious threat to individual rights, due process and democratic freedoms,” said Kade Crockford, director of the Technology for Liberty Program at the ACLU of Massachusetts.
“Face surveillance is dangerous when it doesn’t work, and when it does. There are currently no rules or standards in place in our state to ensure the technology isn’t misused or abused. Massachusetts must pass a moratorium on government use of face surveillance technology until there are safeguards in place to keep people safe and free,” she added.
Legislators have also expressed anger at Amazon over its responses. The company has flat-out refused requests from both state and federal lawmakers for more information about the software, and it won't even tell lawmakers which government agencies are currently using the technology.
“Not only do I want to see them address our concerns with the sense of urgency it deserves, but I also want to know if law enforcement is using it in ways that violate civil liberties, and what — if any — protections Amazon has built into the technology to protect the rights of our constituents,” Representative Jimmy Gomez, a California Democrat who has been investigating Amazon’s facial recognition practices, said in an interview with The New York Times.