Privacy campaigners and politicians have stated that the police should stop using facial recognition technology as a method of public surveillance.
Those in favour of the technology argue that it protects the public and helps identify criminals and suspects. Privacy campaigners disagree, stating that the technology is inaccurate and incompatible with human rights.
A letter written by Big Brother Watch argues that facial recognition technology is being implemented around the UK without proper scrutiny from politicians. The letter has been signed by more than 18 politicians, including Diane Abbott, Jo Swinson and Caroline Lucas. Amnesty International and another 24 campaign groups have also signed it.
Silkie Carlo, director of Big Brother Watch, told the BBC Victoria Derbyshire programme that there is a “surveillance crisis” which urgently needs to be stopped. The point of the letter is to get the government’s attention: “What we’re doing is putting this to government to say: Please can we open this debate and have this conversation.”
Recently it was revealed that facial recognition technology was being used on the King’s Cross estate without the public’s knowledge. Soon after, it emerged that the British Transport Police and the Metropolitan Police had supplied images for a database used to carry out the facial scans.
Privacy advocates have repeatedly stressed that the technology is vulnerable to bias and is more likely to misidentify people of colour and women. This is commonly attributed to some systems being trained on insufficiently diverse datasets.
Areeq Chowdhury, head of Future Advocacy, said: “You could see a situation where you are identifying innocent individuals who are from a particular minority. Which means they’ll be questioned by the police even though they’re innocent and they may even have their details and picture captured on record, despite having committed no crime.”
However, those in support of the technology maintain that it would help policing.
Zak Doffman, CEO of Digital Barriers, said: “Imagine I know there is a group of individuals in central London that want to do harm on a massive scale to the public. Would you have public support to use facial recognition to try and intercept that group of individuals before they can do harm? I would suggest almost categorically you would.
“I’ll give you the opposite example: an individual has been kicked out of the pub for drinking too much on a Saturday night. The pub has taken a photo of that individual. Should they then be prevented from getting into that establishment or other establishments because of that incident? I think you’ll have very little public consent for that example.
“Unfortunately there’s no clarity. There’s no regulation that governs either case and that is the challenge.”
Doffman did stress that he did not support the “indiscriminate” use of the technology.
It is evident that strict regulations on how the technology is used need to be enforced and adopted by police forces.
Tony Porter, the UK’s Surveillance Camera Commissioner, commented: “There should be a standard around its siting, efficiency and effectiveness.”
The post #privacy: Facial recognition surveillance needs to stop appeared first on PrivSec Report.