Police use of facial recognition is legal, Cardiff high court rules
Police use of automatic facial recognition technology to search for people in crowds is lawful, the high court in Cardiff has ruled.
Although the mass surveillance system interferes with the privacy rights of those scanned by security cameras, two judges have concluded, it is not illegal.
The legal challenge was brought by Ed Bridges, a former Liberal Democrat councillor from Cardiff, who noticed the cameras when he went out to buy a lunchtime sandwich. He was supported by the human rights organisation Liberty. He plans to appeal against the judgment.
Bridges said he was distressed by police use of the technology, which he believes captured his image while out shopping and later at a peaceful protest against the arms trade.
During the three-day hearing in May, his lawyers alleged the surveillance operation breached data protection and equality laws.
The judges found that although automated facial recognition (AFR) amounted to interference with privacy rights, there was a lawful basis for it and the legal framework used by the police was proportionate.
Dismissing the challenge, Lord Justice Haddon-Cave, sitting with Mr Justice Swift, said: “We are satisfied both that the current legal regime is adequate to ensure appropriate and non-arbitrary use of AFR Locate, and that South Wales police’s use to date of AFR Locate has been consistent with the requirements of the Human Rights Act and the data protection legislation.”
Responding to the judgment, Megan Goulding, a Liberty lawyer, said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms. Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all.
“It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets.”
Bridges said: “South Wales police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent. This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.”
Automated facial recognition (AFR) is technology that can identify people by analysing and comparing facial features to those held in a database.
You might recognise it from auto-tagging of pictures on Facebook or on your phone, but it is increasingly being used out in the real world.
Shoppers at retail parks such as Westfield, for example, are routinely scanned and recorded by dozens of hidden cameras built into the centres’ digital advertising billboards. The cameras can determine not only your age and gender, but your mood, cueing up tailored advertisements within seconds, thanks to facial detection technology.
British police have also used the technology to scan crowds at events and demonstrations to identify ‘people of interest’.
In the UK, a court action claimed that South Wales police violated privacy and data protection rights by using facial recognition technology on individuals. The force defended its actions, saying that AFR was similar to the use of DNA to solve crimes and would have little impact on those who were not suspects.
The UK’s biometrics commissioner has warned that police forces are pushing ahead with the use of AFR systems in the absence of clear laws on whether, when or how the technology should be employed.
The pressure group Liberty has denounced AFR as 'arsenic in the water supply of democracy', and the city of San Francisco has already barred the use of automatic facial recognition by law enforcement.
A crucial argument against the police’s deployment of the technology is that it doesn’t yet work very well. It is especially inaccurate and prone to bias when used against people of colour: a test of Amazon’s facial recognition software found that it falsely identified 28 members of the US Congress as known criminals, with members of the Congressional Black Caucus disproportionately represented.
Facial recognition technology maps faces in a crowd and then compares them to a watch list of images, which can include suspects, missing people and persons of interest to the police.
The cameras scan faces in large crowds in public places such as streets, shopping centres, football crowds and music events such as the Notting Hill carnival.
Three UK forces have used facial recognition in public spaces since June 2015: the Met, Leicestershire and South Wales police.
Lawyers for South Wales police told the hearing facial recognition cameras prevented crime, protected the public and did not breach the privacy of innocent people whose images were captured.
The technology was likened to police use of DNA. Those not on a watch list would not have their data stored after being scanned by AFR cameras, the court was told.
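For readers unfamiliar with how such systems work, the sketch below illustrates the general compare-and-discard logic described in court: a face captured on camera is reduced to a numerical representation, compared against a watch list, and flagged only if it matches. It is a minimal illustration under assumed details; the embedding size, similarity measure and threshold are hypothetical placeholders and do not represent the software actually deployed by South Wales police.

```python
from __future__ import annotations

import numpy as np

# Illustrative only: a toy version of watch-list matching with a
# compare-then-discard rule. All numbers here are hypothetical.

MATCH_THRESHOLD = 0.6  # hypothetical similarity cut-off


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def screen_face(face_embedding: np.ndarray,
                watch_list: dict[str, np.ndarray]) -> str | None:
    """Return the identity of the best watch-list match, or None.

    If no entry clears the threshold, nothing is returned or stored,
    mirroring the claim that data on non-matches is not retained.
    """
    best_id, best_score = None, -1.0
    for identity, listed_embedding in watch_list.items():
        score = cosine_similarity(face_embedding, listed_embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= MATCH_THRESHOLD else None


# Example with random vectors standing in for real face embeddings.
rng = np.random.default_rng(seed=0)
watch_list = {"suspect-001": rng.normal(size=128)}
probe = rng.normal(size=128)           # a face captured from the crowd
print(screen_face(probe, watch_list))  # almost certainly None (no match)
```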
Fiona Barton QC, a barrister at 5 Essex Court chambers who specialises in police use of technology, said: “This case should not be taken as a green light to go ahead with the use of AFR in all and any circumstances: it was decided on specific facts, within a specific legal framework applicable to certain public authorities, and by reference to South Wales police’s policy and other documents.

“There is much discussion to be had and advice to be given in relation to specific aspects of how businesses might lawfully employ AFR. It is not a matter of one size fits all.”
The chief constable of South Wales police, Matt Jukes, said: “I recognise that the use of AI and face-matching technologies around the world is of great interest and, at times, concern. So, I’m pleased that the court has recognised the responsibility that South Wales Police has shown in our programme.
“There is, and should be, a political and public debate about wider questions of privacy and security. It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.”
A spokeswoman for the Information Commissioner’s Office, which intervened in the case, said: “We welcome the court’s finding that the police use of live facial recognition systems involves the processing of sensitive personal data of members of the public, requiring compliance with the Data Protection Act 2018.
“This new and intrusive technology has the potential, if used without the right privacy safeguards, to undermine rather than enhance confidence in the police.
“Our investigation into the first police pilots of this technology has recently finished. We will now consider the court’s findings in finalising our recommendations and guidance to police forces about how to plan, authorise and deploy any future LFR systems.”

A survey of more than 4,000 adults released on Wednesday by the Ada Lovelace Institute found that a majority (55%) want the government to impose restrictions on police use of facial recognition technology, but that nearly half (49%) support its use in day-to-day policing, assuming appropriate safeguards are in place.
Carly Kind, the director of the Ada Lovelace Institute, said: “The fact that the deployment of a new technology, with which a significant proportion of the public are not comfortable, can be deemed technically compliant underlines the need for an informed public discourse on what new restrictions and safeguards are needed. Facial recognition technology may be lawful, but that does not mean its use is ethical, especially outside of the very limited circumstances examined by the court in this case.”