Source: https://www.theguardian.com/technology/2019/may/29/facial-recognition-must-not-introduce-gender-or-racial-bias-police-told

Facial recognition must not introduce gender or racial bias, police told
Facial recognition software should only be used by police if they can prove it will not introduce gender or racial bias to operations, an ethics panel has said.
A report by the London policing ethics panel, which was set up to advise City Hall, concluded that while there were “important ethical issues to be addressed” in the use of the controversial technology, they did not justify not using it at all.
Live facial recognition (LFR) technology is designed to check people passing a camera in a public place against images on police databases, which can include suspects, missing people or persons of interest to the police.
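In outline, that matching step works by reducing each face to a numerical “embedding” and comparing it with the embeddings of everyone on a watchlist. The following is a minimal sketch of the idea in Python, not a description of any police system: the embeddings, the watchlist format and the 0.6 threshold are all assumptions made for the example.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face embeddings, in the range [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face, watchlist, threshold=0.6):
    # Return the watchlist ID of the strongest match above the
    # threshold, or None if nobody on the list is a plausible match.
    # 'watchlist' maps person IDs to reference embeddings (assumption).
    best_id, best_score = None, threshold
    for person_id, reference in watchlist.items():
        score = cosine_similarity(face, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

Everything downstream of that comparison, such as who reviews an alert and what they do with it, is an operational choice rather than a property of the algorithm, which is why the panel’s recommendations address operating procedures as well as the software itself.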
The technology has been used to scan faces in large crowds in public places such as streets and shopping centres, and in football crowds and at events such as the Notting Hill carnival.
The Metropolitan police have carried out 10 trials using the technology across London, the most recent being in Romford town centre in mid-February.
In these trials the watchlist only contained images of individuals wanted by the Met and the courts for “violent-related offences”. Police said the trials led to a number of arrests based on positive identifications.
In a report following a review of the Met’s use of the software, the panel said it should only be used if the overall benefit to public safety was “great enough to outweigh any potential public distrust in the technology”.
The report said facial recognition software should not be used unless there was evidence that it would not generate gender or racial bias in policing operations. Concerns have been raised by scientific and civic groups that there are possible intrinsic biases in facial recognition technology, which may mean it is less effective at identifying BAME and female faces. The panel said the Met’s trials with the software were “a source of insight into any intrinsic bias, and should help to indicate how such bias would or would not feed forward into policing operations”.
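One concrete way to read that recommendation: if the trial data and evaluations are published, anyone can compare error rates across demographic groups. The sketch below shows one such check in Python, a false-match rate per group; the record format is a hypothetical illustration for the example, not the Met’s actual evaluation schema.

from collections import defaultdict

def false_match_rates(records):
    # For each demographic group, the share of people who were NOT on
    # the watchlist but whom the system nonetheless flagged as a match.
    flagged = defaultdict(int)
    eligible = defaultdict(int)
    for r in records:
        if not r["on_watchlist"]:      # only count non-watchlist passers-by
            eligible[r["group"]] += 1
            if r["flagged"]:
                flagged[r["group"]] += 1
    return {g: flagged[g] / n for g, n in eligible.items() if n}

# Hypothetical records; a rate that differs markedly between groups
# would suggest intrinsic bias feeding forward into who gets stopped.
trial = [
    {"group": "A", "on_watchlist": False, "flagged": True},
    {"group": "A", "on_watchlist": False, "flagged": False},
    {"group": "B", "on_watchlist": False, "flagged": False},
    {"group": "B", "on_watchlist": False, "flagged": False},
]
print(false_match_rates(trial))  # {'A': 0.5, 'B': 0.0}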
“We argue it is in the public interest to publish the trial data and evaluations, to address these concerns,” the panel concluded. “Additionally, because the actions of human operators affect the technology’s functioning in the field and therefore the public’s experience of automated recognition, appropriate LFR operating procedures and practices need to be developed.”
As part of its research the panel surveyed a weighted sample of 1,092 Londoners about the police’s use of LFR. More than 57% felt its use by police was acceptable. This figure increased to 83% when respondents were asked whether the technology should be used to search for serious offenders.
Half of respondents thought the use of the software would make them feel safer, but more than a third said they were concerned about its impact on their privacy and that police would collect data on people who had not committed crimes. Only 56% of those surveyed thought that police would use their personal data in accordance with the law.
Almost half of respondents thought the technology would lead to personal information being collected about some groups more than others. Younger people were less accepting of police use of facial recognition technology than older people, and Asian and black people were less accepting of it than white respondents.
The report comes after the information commissioner expressed concern last week over the lack of a formal legal framework for the use of facial recognition cameras by police.
The comments were made during a court hearing in the landmark case of Ed Bridges, an office worker from Cardiff who claims South Wales police violated his privacy and data protection rights by using the technology on him when he went to buy a sandwich during his lunch break and when he attended a peaceful anti-arms demonstration.
The Metropolitan police welcomed the report. Det Ch Supt Ivan Balhatchet, who has led the force’s trials, said: “We want the public to have trust and confidence in the way we operate as a police service and we take the report’s findings seriously. The MPS will carefully consider the contents of the report before coming to any decision on the future use of this technology.”