This article is from the source 'guardian'. It last changed over 40 days ago and won't be checked again for changes.

You can find the current article at its original source at https://www.theguardian.com/uk-news/2019/jul/12/police-trials-facial-recognition-home-secretary-sajid-javid-technology-human-rights

The article has changed 5 times.

Police trials of facial recognition backed by home secretary
The home secretary, Sajid Javid, has thrown his support behind police trials of controversial facial recognition technology.
The Neoface system used by the Metropolitan police and South Wales police is supplied by the Japanese company NEC, which markets the same technology to retailers and casinos to spot regular customers, and to stadium and concert operators to scan crowds for “potential troublemakers”.
The technology and its use by police have met considerable criticism. Its use by South Wales police is under judicial review, while the information commissioner, Elizabeth Denham, has criticised “a lack of transparency about its use”. Tony Porter, the surveillance camera commissioner, last year intervened to stop Greater Manchester police using facial recognition at the Trafford shopping centre.
This month, University of Essex researchers who were given access to six live trials by the Met found matches were correct in only a fifth of cases and the system was likely to break human rights laws.
The BBC reported that Javid supported the trials at the launch of computer technology aimed at helping police fight online child abuse.
“I back the police in looking at technology and trialling it and … different types of facial recognition technology is being trialled especially by the Met at the moment and I think it’s right they look at that,” he said.
The civil rights campaign group Liberty has previously called facial recognition “a dangerously intrusive and discriminatory technology that destroys our privacy rights and forces people to change their behaviour”.
Automated facial recognition (AFR) is a catch-all term for technology that can identify people by analysing and comparing facial features against those held in a database, typically by recording the unique ratios between an individual’s features, such as eyes, nose and mouth.
You might recognise it from the auto-tagging of pictures on Facebook or on your phone, but it is increasingly being used out in the real world. Shoppers at retail parks such as Westfield in London, for example, are routinely scanned and recorded by dozens of hidden cameras built into the centres’ digital advertising billboards; thanks to facial detection technology, the cameras can determine not only your age and gender but your mood, cueing up tailored advertisements within seconds.
The technology greatly improves the power of surveillance. At the simple end, a facial recognition system connected to a network of cameras can automatically track an individual as they move in and out of coverage, even if no other information is known about them. At the more complex end, a system fuelled by a large database of labelled data can enable police to pinpoint a person of interest across a city of networked cameras.
British police have used the technology to scan crowds at events and demonstrations to identify “people of interest”. After a trial, London’s Metropolitan police said they would start to use it in London within a month. On Friday, the force said it would be used to find suspects on “watchlists” for serious and violent crime, as well as to help find children and vulnerable people. Scotland Yard said the public would be aware of the surveillance, with the cameras placed in open locations and officers handing out explanatory leaflets.
Facial recognition frequently sparks two distinct fears: that it will not work well enough, or that it will work too well.
The first concern highlights the fact that the technology, still in its infancy, is prone to false positives and false negatives, particularly when used with noisy imagery such as that harvested from CCTV cameras installed years or decades ago. When that technology is used to arrest, convict or imprison people on a possibly faulty basis, it can cause real harm. Worse, the errors are not evenly distributed: facial recognition systems have regularly been found to be inaccurate at identifying people with darker skin. A test of Amazon’s facial recognition software found that it falsely identified 28 members of the US Congress as known criminals, with members of the Congressional Black Caucus disproportionately represented.
But the technology will improve, meaning the second concern is harder to shake: the fear that facial recognition inherently undermines freedom by enabling perfect surveillance of everyone, all the time. The fear is not hypothetical; Chinese cities have proudly used the technology to publicly shame citizens for jaywalking, or for leaving the house in their pyjamas.
In the UK, a court action claims that South Wales police violated privacy and data protection rights by using facial recognition technology on individuals. The police force defended its actions, saying that AFR was similar to the use of DNA to solve crimes and would have little impact on those who were not suspects. The UK’s biometrics commissioner has warned that police forces are pushing ahead with AFR systems in the absence of clear laws on whether, when or how the technology should be employed. The pressure group Liberty has denounced AFR as “arsenic in the water supply of democracy”, and the city of San Francisco has already barred the use of automatic facial recognition by law enforcement. There was anger in January 2020 among football supporters and civil rights activists after the technology was used to scan the crowd at the Cardiff City-Swansea City derby.
Alex Hern, Technology editor
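The matching step described above, comparing a face’s recorded feature ratios against a database and accepting the closest entry only if it is within some threshold, can be sketched in a few lines. This is an illustrative toy, not the Neoface system or any police deployment: the feature vectors, watchlist names and threshold value are all invented for the example, and it shows how a loose threshold produces false positives while a tight one produces false negatives.

```python
import math

def euclidean(a, b):
    """Distance between two face feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(probe, watchlist, threshold=0.25):
    """Return the closest watchlist entry within the threshold, else None.

    Loosening the threshold raises false positives (innocent faces
    flagged); tightening it raises false negatives (wanted faces missed).
    """
    best_name, best_dist = None, float("inf")
    for name, features in watchlist.items():
        d = euclidean(probe, features)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Made-up feature ratios (e.g. distances between eyes, nose and mouth,
# normalised) for two hypothetical watchlist entries.
watchlist = {
    "suspect_a": [0.42, 0.31, 0.77],
    "suspect_b": [0.55, 0.29, 0.61],
}

print(match([0.43, 0.30, 0.75], watchlist))  # close to suspect_a
print(match([0.90, 0.10, 0.20], watchlist))  # nothing within threshold -> None
```

The single threshold is the whole trade-off the explainer describes: there is no setting that eliminates both error types at once, which is why trial accuracy figures such as the Essex researchers’ one-in-five result matter.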
The Home Office said it believed there was an adequate legal framework for its use and it supported police trials, but added it was reviewing ways to simplify and extend governance and oversight of biometrics.
Javid said police would be given “game-changing” technological tools to bolster the fight against online child abuse.
According to the Home Office, the three new tools will help speed up investigations and limit the number of indecent images officers have to view.
The technology, which cost £1.76m, aims to improve the capability of the Child Abuse Image Database, which holds millions of images.
Police
Facial recognition
Human rights
Privacy
Metropolitan police
Sajid Javid
London
news