Amazon’s Rekognition feels the heat
In the United States, a civil rights organization is trying to put an end to the use of Rekognition, Amazon's facial recognition tool, by authorities, including the police. Beyond the civil rights concerns, the tool is feared to be unreliable: Rekognition identified 28 members of Congress as criminals…
The American Civil Liberties Union (ACLU) is not letting go: the organization is campaigning to ban the use of Rekognition by US authorities, especially the police. Its first objection was that the tool violated civil rights.

28 members of Congress out of 535 spotted as criminals by Rekognition!

But a test carried out by the ACLU added grist to its mill: Rekognition doesn't work.
The organization tested the system with the pictures of the 535 members of the US Congress, comparing those pictures against a database of 25,000 mugshots.
The ACLU used Rekognition's default settings and let it scan for matches. Surprisingly, according to the system, 28 members of Congress were matched to mugshots of arrested individuals.

Risk of misuse by law enforcement

"The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country," says the ACLU.
With this very concrete example in hand, the organization could then warn against the use of Rekognition by the police. "If law enforcement is using Amazon Rekognition, it's not hard to imagine a police officer getting a 'match' indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification," says the ACLU.

Amazon argues the parameters were wrong…

Amazon's response was quick and rested on two numbers: 80% and 95%. By default, Rekognition uses a confidence threshold of 80%, meaning the system reports a match whenever its similarity score reaches 80%. According to Amazon, that level is adequate for recognizing animals and objects.
For sensitive data like human faces, Amazon says the threshold must be raised to 95%. The ACLU countered that no such warning appears when the tool is set up: nothing prevents law enforcement from using the wrong settings, deliberately or not.
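To illustrate how that threshold behaves, here is a minimal Python sketch. The similarity scores below are invented for illustration; only the cutoff logic mirrors the confidence-threshold parameter Amazon describes (in the real API it corresponds to settings such as `FaceMatchThreshold`, which defaults to 80):

```python
# Hypothetical similarity scores for one probe photo against a mugshot
# database (the scores are made up; real ones would come from the API).
candidate_matches = [
    {"name": "mugshot_0412", "similarity": 81.3},
    {"name": "mugshot_1177", "similarity": 88.9},
    {"name": "mugshot_2045", "similarity": 96.2},
]

def filter_matches(matches, threshold):
    """Keep only candidates whose similarity meets the confidence threshold."""
    return [m for m in matches if m["similarity"] >= threshold]

# At the 80% default, all three candidates count as "matches"...
default_hits = filter_matches(candidate_matches, 80.0)
# ...but at the 95% level Amazon recommends for faces, only one survives.
strict_hits = filter_matches(candidate_matches, 95.0)

print(len(default_hits))  # 3
print(len(strict_hits))   # 1
```

The same probe photo thus yields three "criminal matches" under the default settings but only one under the stricter threshold, which is the crux of the dispute between Amazon and the ACLU.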

… and that does not ease the concerns of the ACLU, which is determined to go all the way!

For the organization, this little experiment exposes the limits of the tool and the danger of letting law enforcement use it. The ACLU points out that if the government has recommended Rekognition, it is because the service is cheap (under $12/month).
The risk that the police (or anyone else with access) turn it into a surveillance tool, with no regard for privacy, is reason enough to demand its withdrawal, according to the ACLU. All the more so if poorly chosen settings turn respectable legislators into dangerous suspects…