Facial recognition fails accuracy test, raising privacy concerns; ACLU sues Clearview AI

Facial recognition technology is once again being called into question after Amazon's "Rekognition" software was found to incorrectly match 105 U.S. and U.K. politicians to police mugshots.

A May 28 blog post by privacy advocate Paul Bischoff criticized the tool as inaccurate, comparing new Comparitech data with the results of a July 2018 test by the American Civil Liberties Union (ACLU) that matched photos of members of Congress against actual mugshots. Meanwhile, the technology continues to be "peddled" to U.S. law enforcement agencies as a recommended tool.

“The ACLU found 28 false matches, highlighting the shortcomings of face recognition technology that’s being peddled to law enforcement agencies nationwide,” Bischoff noted, adding that the technology has not improved much, according to Comparitech’s latest experiment almost two years later.

The new study added British lawmakers, expanding the subject pool to 1,959 elected officials (530 members of the U.S. Congress and 1,429 members of the U.K. Parliament's House of Commons and House of Lords), who were matched against 25,000 police arrest photos. In its earlier rebuke, the ACLU noted that Rekognition's default settings leave leeway on accuracy: the confidence threshold used in that test was 80 percent, even though Amazon Web Services said in a blog post that law enforcement should use a threshold of 99 percent. Local police reportedly set the threshold themselves and are under no obligation to follow Amazon's recommendation.

Bischoff said the end result is a battery of false positives and racially biased misidentifications, with non-white subjects incorrectly matched to mugshots at a higher rate than white subjects.

Meanwhile, the ACLU and other privacy advocates filed a lawsuit on May 28 against facial recognition company Clearview AI, aiming to bring an end to its surveillance activities. The lawsuit is the first to force a face recognition surveillance company to answer directly to groups representing survivors of domestic violence and sexual assault, undocumented immigrants, and other vulnerable communities uniquely harmed by face recognition surveillance.

The lawsuit was filed in Illinois state court on behalf of the ACLU, the ACLU of Illinois, the Chicago Alliance Against Sexual Exploitation, the Sex Workers Outreach Project, the Illinois State Public Interest Research Group (PIRG), and Mujeres Latinas en Acción.

The groups argue that Clearview AI violated — and continues to violate — the privacy rights of Illinois residents under the Illinois Biometric Information Privacy Act (BIPA).

“Clearview has violated the rights of all Illinoisans, but the harms from this technology are not shared equally across our state,” said Linda Xóchitl Tortolero, president and CEO of Mujeres Latinas en Acción, a non-profit organization dedicated to empowering Latinas, including by providing services to survivors of domestic violence and sexual assault, and undocumented immigrants.

Using face recognition technology, Clearview has captured more than three billion faceprints from images available online, all without the knowledge — much less the consent — of those pictured, according to the ACLU.

“Companies like Clearview will end privacy as we know it, and must be stopped,” stated Nathan Freed Wessler, senior staff attorney with the ACLU’s Speech, Privacy, and Technology Project. “This menacing technology gives governments, companies, and individuals the unprecedented power to spy on us wherever we go — tracking our faces at protests, AA meetings, political rallies, places of worship, and more,” he added.
