[File photo, Sept. 6, 2012: the Amazon logo in Santa Monica, Calif.]
Amazon on Wednesday banned police use of its face-recognition technology for a year, making it the latest tech giant to step back from law-enforcement use of systems that have been criticized for incorrectly identifying people with darker skin.
The Seattle-based company did not say why it took action now. Ongoing protests following the death of George Floyd have focused attention on racial injustice in the U.S. and how police use technology to track people. Floyd died May 25 after a white Minneapolis police officer pressed his knee into the handcuffed black man’s neck for several minutes even after Floyd stopped moving and pleading for air.
Law enforcement agencies use facial recognition to identify suspects, but critics say it can be misused. A number of U.S. cities have banned its use by police and other government agencies, led by San Francisco last year.
On Tuesday, IBM said it would get out of the facial recognition business, noting concerns about how the technology can be used for mass surveillance and racial profiling.
Civil rights groups and Amazon's own employees have pushed the company to stop selling its technology, called Rekognition, to government agencies, saying that it could be used to invade people’s privacy and target minorities.
In a blog post Wednesday, Amazon said that it hoped Congress would put in place stronger regulations for facial recognition.
“Amazon’s decision is an important symbolic step, but this doesn’t really change the face recognition landscape in the United States since it’s not a major player,” said Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology. Her public records research found only two U.S. agencies using or testing Rekognition. The Washington County Sheriff’s Office in Oregon has been the most public about using it. The Orlando police department tested it, but chose not to implement it, she said.
Studies led by MIT researcher Joy Buolamwini found racial and gender disparities in facial recognition software. Those findings spurred Microsoft and IBM to improve their systems, but irked Amazon, which last year publicly attacked her research methods. A group of artificial intelligence scholars, including a winner of computer science’s top prize, last year launched a spirited defense of her work and called on Amazon to stop selling its facial recognition software to police.
Buolamwini on Wednesday called Amazon’s move a “welcomed though unexpected announcement.”
“Microsoft also needs to take a stand,” she wrote in an emailed statement. “More importantly our lawmakers need to step up” to rein in harmful deployments of the technologies.
Microsoft has been vocal about the need to regulate facial recognition to prevent human rights abuses but hasn’t said it wouldn’t sell it to law enforcement. The company didn't respond to a request for comment Wednesday.
Amazon has attracted outsize attention since it introduced Rekognition in 2016 and began pitching it to law enforcement. Many U.S. agencies rely on facial recognition technology built by companies that are not as well known, such as Tokyo-based NEC or the European companies Idemia and Cognitec, Garvie said.
Amazon said organizations, such as those that use Rekognition to help find missing children, will still have access to the technology.