In an ACLU test, Amazon’s facial recognition tech wrongly matched mugshots to 28 members of Congress


The American Civil Liberties Union used facial recognition software to cross-reference a mugshot database with a list of Congress members. The test incorrectly matched 28 members of Congress with mugshots of people arrested for a crime.

The ACLU used Amazon’s Rekognition software to show how error-prone facial recognition tech can be. “The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats; men and women; and legislators of all ages, from all across the country,” the ACLU said in a blog post.

The ACLU tested 25,000 publicly available arrest photos against all 535 members of the House of Representatives and the Senate. Of the entire group, 28 were matched with people who had been arrested — 25 from the House and three from the Senate. The test cost the ACLU only $12.33.

Almost 40% of the false matches were people of color, even though they make up only 20% of Congress, according to the ACLU. The Congressional Black Caucus penned a letter to Amazon’s Jeff Bezos in May, expressing concern about the company’s facial recognition tech.

“It is quite clear that communities of color are more heavily and aggressively policed than white communities,” Cedric Richmond, chair of the Congressional Black Caucus, said in the letter. “This status quo results in an oversampling of data which, once used as inputs to an analytical framework leveraging artificial intelligence, could negatively impact outcomes in those over-sampled communities.”

Congress members of color were disproportionately matched in the ACLU’s test

Facial recognition software fails darker-skinned people, according to Brian Brackeen, CEO of the facial recognition company Kairos, because of how the software is trained.

“Many of these algorithms start in universities, where they use students on campus as data for initial training,” Brackeen told Mic in a prior interview. “If it only sees 12 faces of African descent and 1,000 people of European descent, it will become very adept at detecting European faces, more so than African. The algorithm itself isn’t essentially racist so much as the training.”

Amazon, for its part, blamed the settings rather than the software. In a response sent to the Verge by an Amazon spokesperson, the company said the false matches can be attributed to an improperly low confidence threshold: according to the company, the ACLU’s test was performed with Rekognition’s confidence threshold at its default of 80%, while Amazon recommends law enforcement only trust results at a threshold of 95% or higher. Still, the ACLU has petitioned Amazon to stop selling its facial recognition tech to the government. The petition has over 150,000 signatures.
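The effect of that threshold dispute can be illustrated with a short sketch. The scores and match names below are hypothetical, not real Rekognition output; the point is simply that raising the cutoff from 80% to Amazon’s recommended 95% discards the weaker candidate matches that are most likely to be wrong.

```python
# Illustrative sketch only: hypothetical similarity scores, not real
# Rekognition output. Shows how the confidence threshold changes which
# candidate matches a system reports for a single probe photo.

def filter_matches(matches, threshold):
    """Keep only candidate matches whose confidence meets the threshold."""
    return [m for m in matches if m["confidence"] >= threshold]

# Hypothetical candidate matches returned for one probe photo.
candidates = [
    {"name": "match_a", "confidence": 99.1},
    {"name": "match_b", "confidence": 87.4},
    {"name": "match_c", "confidence": 81.0},
]

# At the 80% default, all three candidates are reported...
print(len(filter_matches(candidates, 80.0)))  # 3
# ...but at the recommended 95%, only the strongest match survives.
print(len(filter_matches(candidates, 95.0)))  # 1
```

A higher threshold trades false positives for false negatives: fewer innocent people are flagged, but genuine matches below the cutoff are silently dropped — which is why the ACLU argues the tool shouldn’t be in police hands at all, rather than merely reconfigured.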

Law enforcement already uses facial recognition to help identify people suspected of crimes. Amazon has provided local law enforcement agencies in Oregon and Florida with the same Rekognition software tested by the ACLU. The FBI uses facial recognition tech as well, though the bureau says it’s only used to generate investigative leads, not to positively identify someone as a culprit. In China, facial recognition tech has been used to single out a wanted man at a crowded concert.

However, studies like Georgetown’s Perpetual Line-Up have shown that many states, like Ohio, California and Pennsylvania, have used facial recognition tech that’s less than accurate. Poorly trained facial recognition apps have broader repercussions than just mistaken arrests. Parents of dark-skinned children could feel the effects too.

“If your child goes missing and you want to employ facial recognition to find them, you’re going to be really effing mad that it couldn’t find that child,” Brackeen said.

As Amazon noted, the company recommends that law enforcement only trust the software when it’s near full confidence. But the high rate of false matches for Congress members of color suggests Amazon may need to improve how Rekognition is trained, not just how it’s configured.