
Report: facial recognition software inaccurate in up to 98% of cases

Of the 104 alerts triggered by the facial recognition tech, only two were accurate.

[Photo: rawpixel]

BY Michael Grothaus

The Independent has published figures obtained under a freedom of information request showing that the facial recognition software U.K. police forces are trialing is mostly inaccurate. The figures show that of the 104 alerts triggered by the facial recognition tech the U.K.’s Metropolitan Police use, only 2 were accurate. That’s a 98% false-positive rate. Another system, used by South Wales Police over a 15-day period, returned more than 2,400 false positives, with only 234 alerts being correct matches, a false-positive rate of over 90%. The news will bolster privacy campaigners who say the use of such systems is Orwellian and has the potential to cause more harm to citizens than good.
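As a rough, illustrative sketch (not part of the original reporting), the percentages quoted above follow directly from the alert counts: the false-positive rate is simply the share of alerts that were not correct matches. In Python:

def false_positive_rate(total_alerts, correct_matches):
    # Share of alerts that were not correct matches.
    return (total_alerts - correct_matches) / total_alerts

# Metropolitan Police: 104 alerts, 2 accurate -> roughly 98%
print(f"Met Police: {false_positive_rate(104, 2):.0%}")

# South Wales Police: more than 2,400 false positives plus 234 correct matches -> over 90%
print(f"South Wales Police: {false_positive_rate(2400 + 234, 234):.0%}")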


ABOUT THE AUTHOR

Michael Grothaus is a novelist and author. He has written for Fast Company since 2013, where he's interviewed some of the tech industry’s most prominent leaders and writes about everything from Apple and artificial intelligence to the effects of technology on individuals and society.

