Amazon won't allow police to use its facial recognition technology for one year
As cities and municipalities across the country start to rethink how law enforcement should operate, companies are also beginning to reconsider their relationships with police. Just days after IBM revealed that it would stop selling and developing facial recognition technology out of concerns for potential human rights and privacy abuses, Amazon followed suit. On Wednesday, the company announced it would institute a one-year moratorium on police use of its facial recognition technology, Rekognition.
In a blog post, Amazon stated that it is suspending law enforcement access to Rekognition to give federal lawmakers adequate time to craft rules regulating the use of facial recognition technology. "We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge," the company said. Amazon appears to be referencing the Justice in Policing Act, introduced earlier this week. Among other reforms, the bill would restrict how police can use facial recognition technology.
Amazon did not address concerns about racial and gender bias that have plagued its Rekognition technology for years. Last year, a study conducted by MIT Media Lab found that Rekognition regularly misidentified people with darker skin and often mistook darker-skinned women for men. In 2018, the ACLU showed that Amazon's technology falsely matched 28 members of Congress, a disproportionate number of whom were people of color, with mugshots. The ACLU of Massachusetts later ran a similar experiment, this time with professional athletes. Once again, it found that Rekognition falsely matched one out of every six athletes to mugshots. The sports figures who were incorrectly IDed as criminals were disproportionately Black.
Amazon has regularly pushed back against these charges, claiming that such tests tweak the technology to produce bad results or misrepresent how it is intended to be used. The company has also attempted to duck further scrutiny, choosing not to submit Rekognition to the National Institute of Standards and Technology for evaluation. Studies of Rekognition's accuracy have been troubling enough that dozens of AI researchers and experts have called on Amazon to stop selling its facial recognition technology to law enforcement, out of concern that it will bolster biased policing that hurts communities of color. Up until its decision to suspend police use of Rekognition for one year, Amazon ignored these warnings. In fact, Amazon Web Services CEO Andy Jassy admitted to PBS Frontline earlier this year that the company doesn't know how many police agencies are using the technology, or how it is being used.
Mic asked Amazon if the company will use the one-year moratorium to address racial and gender biases in its Rekognition technology, but the company did not respond.
While Amazon may be temporarily cutting off law enforcement, it said that it will continue allowing organizations like Thorn, the International Center for Missing and Exploited Children, and Marinus Analytics to use Amazon Rekognition to help identify human trafficking victims and find missing children. Mic asked Amazon if it will continue to allow Rekognition to be used by federal agencies, including Immigration and Customs Enforcement (ICE), and will update this post if we receive a response.