FACIAL RECOGNITION WRONGLY IDENTIFIES PUBLIC AS POTENTIAL CRIMINALS 96% OF TIME

May 8, 2019 in News by RBN Staff

Published: May 7, 2019

SOURCE: INDEPENDENT UK

Facial Recognition Art Mural by Yo What Happened To Peace on Flickr (https://www.flickr.com/photos/yowhathappenedtopeace/)

Facial recognition technology has misidentified members of the public as potential criminals in 96 per cent of scans so far in London, new figures reveal.

The Metropolitan Police said the controversial software could help it hunt down wanted offenders and reduce violence, but critics have accused it of wasting public money and violating human rights.

The trials have so far cost more than £222,000 in London and are subject to a legal challenge and a separate probe by the Information Commissioner.

Eight trials carried out in London between 2016 and 2018 resulted in a 96 per cent rate of “false positives” – cases where the software wrongly alerts police that a person passing through the scanning area matches a photo on the database.

Two deployments outside the Westfield shopping centre in Stratford last year saw a 100 per cent failure rate, and monitors said a 14-year-old black schoolboy was fingerprinted after being misidentified.
