
Amazon Introduces 12-Month Moratorium on Police Use of Facial Recognition Tech


Amazon’s controversial face recognition technology, Rekognition, will not be available for use by law enforcement, the company announced on June 10th. The moratorium is year-long and was reportedly introduced to give regulators time to come up with solid ethical rules for using the tech in policing.

Rekognition is Amazon’s advanced AI-based computer vision service for photo and video analysis, released in November 2016. It is a flexible technology that can be used for tasks such as face recognition, sentiment analysis, text detection, and tracking the paths of people and objects across video frames. Microsoft, IBM, and Google offer their own computer vision systems, as do a number of smaller, more narrowly focused companies. The capabilities of Rekognition and similar systems are fascinating, but they come with technical and ethical challenges that become especially apparent in law enforcement applications.
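
Rekognition is exposed through the standard AWS SDKs. As a rough illustration of how a face comparison call looks in practice, here is a minimal sketch using Python’s boto3, assuming AWS credentials are already configured; the file names and region are placeholders, not details from this story:

```python
# Minimal sketch of a Rekognition face comparison via boto3.
# Assumes AWS credentials are configured; file names and region are placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("suspect_photo.jpg", "rb") as src, open("reference_photo.jpg", "rb") as tgt:
    response = client.compare_faces(
        SourceImage={"Bytes": src.read()},
        TargetImage={"Bytes": tgt.read()},
        SimilarityThreshold=80,  # only return matches scored at 80% similarity or higher
    )

for match in response.get("FaceMatches", []):
    print(f"Possible match, similarity: {match['Similarity']:.1f}%")
```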

Importantly, automatic recognition systems are prone to biases and mistakes. As shown in the 2018 Gender Shades study by MIT researchers Joy Buolamwini and Timnit Gebru, the accuracy of face recognition depends greatly on the race and gender of the person in the analyzed photos. The researchers initially compared solutions from Face++, IBM, and Microsoft, then followed the paper up in January 2019 with data on the Kairos and Amazon Rekognition systems. Both parts of the study highlighted the fact that, although improving over time, face recognition solutions perform best on white male faces and struggle to accurately identify women with darker skin.

In another 2018 test, this time by the ACLU, Rekognition was tasked with matching photos of U.S. Congress members against 25,000 publicly available mugshots. The system found 28 matches where it should have found none. Again, people of color were disproportionately misidentified, comprising 38% of the false matches while representing only 20% of Congress.

In 2020, Comparitech ran a similar test on a dataset of 430 U.S. Representatives and 100 Senators. At an 80% confidence threshold, Rekognition produced an average of 32 incorrect matches.
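
The confidence threshold used in these tests corresponds to a parameter on Rekognition’s face search API. A hedged sketch of querying a pre-indexed face collection with boto3 shows where such an 80% cut-off would be set; the collection name and probe image below are hypothetical:

```python
# Sketch of searching an indexed Rekognition face collection at an 80% threshold.
# The collection "mugshots" and the probe image are hypothetical placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("probe.jpg", "rb") as probe:
    response = client.search_faces_by_image(
        CollectionId="mugshots",
        Image={"Bytes": probe.read()},
        FaceMatchThreshold=80,  # raising this value reduces false matches
        MaxFaces=5,
    )

for match in response.get("FaceMatches", []):
    face = match["Face"]
    print(f"{face.get('ExternalImageId', face['FaceId'])}: {match['Similarity']:.1f}%")
```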

Given the racial bias issues associated with law enforcement, these flaws of face recognition tech have great potential to cause harm and ultimately widen the gap between police forces and the communities they ought to protect.

Concerns about the use of Rekognition by police departments across the U.S. have been around at least since 2018. Back then, the ACLU found documents indicating that the Washington County Sheriff’s Office and the city of Orlando had been using the technology since 2017 to match photos and videos of suspects against mugshot databases.

Washington County Sheriff’s Office listing among Rekognition customers. Source: Amazon

Later in 2018, citing the tech’s potential to cause privacy and human rights violations, Amazon’s own shareholders and employees spoke out against marketing Rekognition to government agencies like ICE and DHS.

“In the face of this immoral U.S. policy, and the U.S.’s increasingly inhumane treatment of refugees and immigrants beyond this specific policy, we are deeply concerned that Amazon is implicated, providing infrastructure and services that enable ICE and DHS,” an internal letter from Amazon employees to Jeff Bezos said.

Unfortunately, there is no concrete count of how many U.S. law enforcement agencies use Rekognition. Even Andy Jassy, CEO of Amazon Web Services, said he doesn’t know for sure:

“I don’t think we know the total number of police departments that are using [Amazon’s] facial recognition technology. We have 165 services in our technology infrastructure platform, and you can use them in any combination you want.”

Now, in June 2020, while the U.S. is engulfed in protests against systemic misconduct and racial bias on the part of law enforcement, Amazon is putting Rekognition use by police on hold with a year-long moratorium. The only reason cited in the announcement is to give regulators time to introduce appropriate rules.

“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge. We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested,” the company’s announcement reads.

While barring the police from using Rekognition, Amazon will continue to provide the service for humanitarian organizations:

“We will continue to allow organizations like Thorn, the International Center for Missing and Exploited Children, and Marinus Analytics to use Amazon Rekognition to help rescue human trafficking victims and reunite missing children with their families,” the announcement elaborates.

Notably, two days prior to Amazon’s announcement, IBM decided to shut down its face recognition development, citing accuracy problems and the ethical concerns underscored by the ongoing situation in the U.S.


Source: https://forklog.media/amazon-introduces-12-months-moratorium-on-using-facial-recognition-tech-by-law-enforcement/
