A company using live facial recognition software to scan hundreds of thousands of unwitting people in London is under investigation. The developer, Argent, admitted using facial recognition to monitor its development following an investigation by the Financial Times, which reported that the technology could also be introduced in the Canary Wharf finance district. A report by the London Policing Ethics Panel concluded that the technology must not be used on the general public unless it could be proven to have a significant positive impact that outweighs privacy concerns.
Update, May: Despite evidence that live facial recognition cameras pose a threat to our privacy, freedom of expression and right of association, the Metropolitan Police are continuing to use this authoritarian technology in public spaces.
Facial recognition is a dangerously intrusive and discriminatory technology that destroys our privacy rights and forces people to change their behaviour. Facial recognition works by matching the faces of people walking past special cameras to images of people on a watch list. The technology does this by scanning the distinct points of our faces and creating uniquely identifiable biometric maps — more like a fingerprint than a photograph.
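The matching step described above can be sketched in a few lines. This is a minimal illustration, assuming each face has already been reduced to a fixed-length biometric "map" (an embedding vector); the watchlist names, vectors and distance threshold below are invented for the example and do not come from any real deployment.

```python
import math

# Hypothetical watchlist: each identity maps to a biometric embedding.
# Real systems use high-dimensional vectors; three dimensions are used
# here purely for readability.
WATCHLIST = {
    "suspect_a": [0.12, 0.84, 0.33],
    "suspect_b": [0.91, 0.05, 0.44],
}
THRESHOLD = 0.25  # maximum distance to count as a match (assumed value)

def euclidean(a, b):
    """Distance between two embeddings: smaller means more similar faces."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(face_vector):
    """Return the closest watchlist identity, or None if nobody is close enough."""
    name, dist = min(
        ((n, euclidean(face_vector, v)) for n, v in WATCHLIST.items()),
        key=lambda pair: pair[1],
    )
    return name if dist <= THRESHOLD else None

print(match([0.11, 0.85, 0.30]))  # near suspect_a's embedding
print(match([0.50, 0.50, 0.50]))  # far from everyone -> None
```

The threshold is the crux in practice: set it loosely and innocent passers-by are flagged; set it tightly and genuine matches are missed, which is why the error rates discussed later in this article matter.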
Britain has a long-standing relationship with security cameras. London alone has one of the highest ratios of surveillance cameras per citizen in the developed world. Estimates put the number of surveillance cameras in Greater London at several hundred thousand; a portion of these are used by the City of London Police, according to data obtained through a Freedom of Information request.
Automated Facial Recognition is software that can automatically detect faces in an image or video and compare them with a database of facial images. AFR cannot be used to identify people unless they are on a watchlist. AFR greatly assists us by allowing resources to be deployed elsewhere in protecting our communities.
Last year, the Trafford Centre in Manchester was pressured to stop using live facial recognition after six months of monitoring visitors, following an intervention by the surveillance camera commissioner, Tony Porter. Facial recognition is not currently governed by a specific legal framework in the UK, meaning that private companies can start using it without declaring the move publicly or notifying authorities. The technology is regulated by privacy laws and the Data Protection Act, which gives anyone scanned the right to be informed about how their image has been collected and used. A spokesperson for British Land, which owns the Meadowhall shopping centre, said it does not currently operate facial recognition technology on any of its sites.
By New Scientist staff and Press Association. The first legal battle in the UK over police use of face recognition technology will begin today. Ed Bridges has crowdfunded action against South Wales Police over claims that the use of the technology on him was an unlawful violation of privacy.
Facial recognition technology is mistakenly targeting four out of five innocent people as wanted suspects, according to findings from the University of Essex. The report -- which was commissioned by Scotland Yard -- found that the technology used by the UK's Metropolitan Police is 81 per cent inaccurate, and concludes that it is "highly possible" the system would be found unlawful if challenged in court. The report, obtained by Sky News, is the first independent evaluation of the scheme since the technology was first used at Notting Hill Carnival in August.
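An error rate of that kind is simply the share of system alerts that turn out to be wrong. As an illustration only (the match counts below are assumed for the sketch and are not figures quoted in this article):

```python
# Illustrative calculation of the kind of error rate reported in the
# University of Essex evaluation. The counts are hypothetical placeholders
# chosen to produce a rate in the region the report describes.
total_alerts = 42        # alerts raised by the system (assumed)
verified_correct = 8     # alerts later confirmed as genuine matches (assumed)

false_alerts = total_alerts - verified_correct
error_rate = false_alerts / total_alerts
print(f"{error_rate:.0%} of alerts flagged the wrong person")  # -> 81%
```

The key point is that the rate is measured against the alerts the system raises, not against everyone scanned: even a system that scans thousands of people and flags only a handful can still be wrong about most of the people it flags.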