Amazon says it won’t allow law enforcement to use Rekognition, its facial recognition technology, for a year.
“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” Amazon said in a blog post Wednesday. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”
The company, however, said it will continue to let organizations such as Thorn, the International Center for Missing and Exploited Children, and Marinus Analytics use Amazon Rekognition to help rescue human trafficking victims and reunite missing children with their families.
Amazon’s decision to suspend police use of the software follows a two-year clash between the company and civil liberties activists, who have voiced concern that inaccurate matches could lead to unjust arrests. In January 2019, a study by researchers at MIT and the University of Toronto found that Rekognition misidentified women, especially those with darker skin.
The company has since resisted calls to halt deployment of the software, maintaining that its tools are accurate and that the researchers had used them improperly.
Rekognition is a cloud-based computer vision service that lets customers match photos based on visual similarities. It can identify common objects such as dogs and chairs, and it can also be used to compare human faces.
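For a sense of what the service looks like to developers, the sketch below shows roughly how a face comparison request is made through the AWS SDK for Python (boto3). It is an illustration only, not a description of any police deployment; the image filenames and the 90% similarity threshold are assumptions chosen for the example.

```python
# Minimal sketch of a Rekognition CompareFaces call via boto3.
# Assumes AWS credentials are configured and that "source.jpg" and
# "target.jpg" exist locally (both are placeholder filenames).
import boto3

client = boto3.client("rekognition")

with open("source.jpg", "rb") as source, open("target.jpg", "rb") as target:
    response = client.compare_faces(
        SourceImage={"Bytes": source.read()},
        TargetImage={"Bytes": target.read()},
        SimilarityThreshold=90,  # only return matches scored at 90% or higher
    )

# Each match carries a similarity score and a bounding box for the matched face.
for match in response["FaceMatches"]:
    print(f"Similarity: {match['Similarity']:.1f}%")
```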
The software has been used by local police departments in Florida and Oregon. The city of Orlando has used it to scan faces in real time via body cameras and video surveillance cameras around the city, while in Oregon, Washington County deputies have compared mug shot photos against images obtained from surveillance and other sources.
Two years ago, the American Civil Liberties Union revealed that Amazon had been quietly selling the technology to police. Since then, the organization has called on the company to stop selling its facial recognition technology to law enforcement.
On Wednesday, Nicole Ozer, technology and civil liberties director of the ACLU of Northern California, stated the organization was “glad the company is finally recognizing the dangers face recognition poses to Black and Brown communities and civil rights more broadly.”
“This surveillance technology’s threat to our civil rights and civil liberties will not disappear in a year,” Ozer said. “Amazon must fully commit to a blanket moratorium on law enforcement use of face recognition until the dangers can be fully addressed, and it must press Congress and legislatures across the country to do the same. They should also commit to stop selling surveillance systems like Ring that fuel the over-policing of communities of color.”
Amazon’s decision to place a moratorium on the software follows a similar decision by IBM. On Monday, IBM CEO Arvind Krishna wrote in a letter to Congress that the company would no longer provide facial recognition technology to police departments for mass surveillance and racial profiling.
This article originally ran in CS sister publication Security Sales & Integration. Rodney Bosch is SSI’s senior editor.