‘It's techno-racism’: Detroit is quietly using facial recognition to make arrests
For the last two years, Detroit police have been quietly utilizing controversial and unreliable facial recognition technology to make arrests in the city.
The news, revealed in May in a Georgetown University report, has shocked many Detroiters and sparked a public debate in the city that is still raging and mirrors similar battles playing out elsewhere in America and across the world. Among other issues, critics in the majority-black city point out that flawed facial recognition software misidentifies people of color and women at much higher rates.
Detroit also now has the capability to use the technology to monitor residents in real time, though Detroit’s police chief claims it won’t.
Willie Burton, a black member of the civilian Detroit Police Commission that oversees the department, noted that Detroit’s population is 83% black, which made using the technology especially worrying.
“This should be the last place police use the technology because it can’t identify one black man or woman to another,” he said. “Every black man with a beard looks alike to it. Every black man with a hoodie looks alike. This is techno-racism.”
At a July meeting on the issue held by the police commission, arguments over facial recognition got so heated that officers arrested and temporarily jailed Burton as he loudly objected to its use.
The technology presents obvious questions over whether police are violating residents’ privacy protections. Detroit’s facial recognition software makes it much easier for the city to track people’s movements across time while efficiently and secretly gathering personal information, said Clare Garvie, an author of the report from the Georgetown Law Center on Privacy and Technology.
“It can betray information about sensitive locations – who someone is as a person, if they’re going to church, an HIV clinic, and the supreme court has said we have a right to privacy even if we are in public,” she said.
Garvie conservatively estimates that a quarter of the nation’s 18,000 police agencies now use facial recognition technology, and over half of American adults’ photos are available for investigation.
Chicago runs a program similar to Detroit’s, while the Los Angeles police department may be operating a small number of cameras that track the public in real time.
Meanwhile, some local governments are proposing regulations to limit it. San Francisco and Oakland in California and Cambridge and Somerville in Massachusetts have recently banned the technology. Florida’s Orlando scrapped a pilot real-time surveillance program after the software proved to be unreliable, and New York governor Andrew Cuomo is attempting to implement facial recognition software in New York City, so far without success.
At the federal level, Congress held hearings on the issue in May. Congresswoman Rashida Tlaib, whose district includes parts of Detroit, recently introduced legislation that would prohibit its use in public housing.
“Policing our communities has become more militarized and flawed,” Tlaib said during the 22 May hearing. “Now we have for-profit companies pushing so-called technology that has never been tested in communities of color, let alone been studied enough to conclude that it makes our communities safer.”
But facial recognition software is just the latest in Detroit’s development of a comprehensive public surveillance apparatus that includes multiple camera programs.
As part of its Project Green Light, the city installed nearly 600 high-definition cameras at intersections, schools, churches, public parks, immigration centers, addiction treatment centers, apartment buildings, fast food restaurants, and other businesses around the city.
Police pull still images from those and thousands of other private cameras, then use facial recognition software to cross-reference them against millions of photos drawn from a mugshot database, driver’s license photos, and images scraped from social media.
Were Detroit to start using the software in real time, it could continually scan those entering any location covered by its cameras, or motorists and pedestrians traveling through an intersection, for example.
Though there’s no oversight, Detroit police chief James Craig insists the department won’t use the software in real time and runs still images only as an “investigative tool” for violent crimes.
Police say any match requires “sufficient corroboration” before an arrest can be made. But Garvie notes the software has already led to false arrests elsewhere in the country.
Facial recognition technology’s premise “flips on its head” the idea of innocent until proven guilty, Garvie said at a recent Detroit forum on the topic.
“Biometrically identifying everyone and checking them against a watch list or their criminal history assumes they’re guilty until they prove they’re innocent by not having a record,” she said. “That’s not going to make us more secure. It’s going to make us more afraid.”
A Detroit police spokesperson couldn’t say how many arrests involved the technology, though Craig told the Guardian no false arrests have been made. He acknowledged issues with accuracy, but stressed that matches are treated as a lead and go through a rigorous review process.
“Facial recognition is only part of methodical investigation to identify and confirm that the suspect is involved in that crime,” he said.
Some residents say the technology is already sowing more distrust in Detroit as civil rights advocates accuse the city of intentionally muddying the waters. Georgetown’s report noted police did not mention on the Green Light website that cameras would be used with facial recognition software, and property owners who installed them weren’t made aware of it.
“There’s been no transparency and we won’t stand for it,” Burton said. “We don’t want it here, and we are going to fight back because we deserve better.”