The Scary Thing Amazon's Facial Recognition Can Do
In 2017, Amazon announced three new features for its "Amazon Rekognition" software package. The service had launched the previous year with the promise of making it dramatically easier for developers to apply machine learning to the analysis of digital images. The new features included "detection and recognition of text in images, real-time face recognition across tens of millions of faces, and detection of up to 100 faces in challenging crowded photos," per the press release.
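For readers curious what those three features look like in practice, here is a minimal sketch using the AWS SDK for Python (boto3). The bucket name, image key, and face collection ID are hypothetical placeholders, and the snippet assumes AWS credentials are already configured; it is an illustration of the announced capabilities, not Amazon's own example code.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical input image stored in S3 (placeholder names).
image = {"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}

# 1. Text-in-image detection: returns detected lines and words
# with confidence scores.
text = rekognition.detect_text(Image=image)
for detection in text["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f'{detection["DetectedText"]} ({detection["Confidence"]:.1f}%)')

# 2. Face search: matches the largest face in the image against a
# pre-built face collection, which is how searches "across tens of
# millions of faces" are performed.
matches = rekognition.search_faces_by_image(
    CollectionId="example-collection",  # placeholder collection ID
    Image=image,
    FaceMatchThreshold=90,
    MaxFaces=5,
)
for match in matches["FaceMatches"]:
    print(match["Face"]["FaceId"], f'{match["Similarity"]:.1f}% similar')

# 3. Crowd analysis: DetectFaces returns bounding boxes and attributes
# for up to 100 of the largest faces in a single image.
faces = rekognition.detect_faces(Image=image, Attributes=["DEFAULT"])
print(f'Faces found: {len(faces["FaceDetails"])}')
```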
The breadth of the new features demonstrated how far machine learning had come in just a few years, and the release merely scratched the surface of the ever-improving software's applicability. By describing the technology's immediate impact in the battle against human trafficking, as well as its use by social media sites such as Pinterest to extract "rich text" and thereby improve the cataloging of users' images, Amazon seemed keen to telegraph how widely Rekognition was already being used.
But in 2018, it became clear that Amazon hadn't been completely transparent about some of the customers it was looking to attract. As reported by the American Civil Liberties Union (ACLU), which had obtained documents from three U.S. states through public records requests, Amazon had quietly been marketing Rekognition to law enforcement agencies as a way of quickly and easily identifying "people of interest" in live footage, expressly recommending the technology's deployment in the analysis of police bodycam footage.
Is Amazon's Rekognition software dangerous?
The ACLU pulled no punches in its criticism of Amazon's attempt to convince American law enforcement to begin experimenting with Rekognition as a surveillance tool. "With this technology, police would be able to determine who attends protests. ICE could seek to continuously monitor immigrants as they embark on new lives. Cities might routinely track their own residents, whether they have reason to suspect criminal activity or not ... these systems are certain to be disproportionately aimed at minority communities," wrote ACLU experts Matt Cagle and Nicole Ozer, who authored the report.
That final point was seemingly borne out by tests showing that Rekognition disproportionately misidentified people of color, particularly Black women, a troubling finding that experts attributed to bias in the data sets on which the software was trained (per The Verge).
In an indication of how controversial the technology had become, CNET reported that Amazon's own employees began to protest the company's eagerness to give law enforcement access to Rekognition, citing what they perceived at the time as the U.S. government's "immoral" immigration policies and the possibility that the technology might be abused.
Per The Verge, Amazon initially informed employees that it would continue marketing Rekognition to law enforcement. However, following the killing of George Floyd in 2020, the company seemed to finally acknowledge that Rekognition could very well be dangerous when it announced a one-year moratorium on sales of the software to law enforcement.
"We've advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology ... We hope this one-year moratorium might give Congress enough time to implement appropriate rules," an Amazon press release said.