You can find the current article at its original source at https://www.washingtonpost.com/opinions/a-scary-new-facial-recognition-tool-underlines-the-urgent-need-for-privacy-laws/2020/01/23/6c2646a8-3d37-11ea-baca-eb7ace0a3455_story.html

The article has changed 2 times. There is an RSS feed of changes available.

Version 0 Version 1
Our privacy doomsday could come sooner than we think
PRIVACY DOOMSAYERS have always said the failure to regulate surveillance technology would result in the end of anonymity. But we didn’t realize doom might come this soon.
The New York Times reports on a facial recognition tool that hasn’t made the news before — because it’s not from a Silicon Valley luminary with a big public footprint, such as Amazon or Google, but rather a tiny and secretive start-up called Clearview AI whose impact is as high as its profile is low. More than 600 law enforcement agencies use the company’s tool, which depends on a database of more than 3 billion images gathered from millions of websites, including Facebook, Instagram, Twitter and YouTube.
The facial recognition searches that authorities conduct today are usually restricted to mug shots or driver’s license photos, and there are serious concerns even about those. Now, that database has expanded exponentially, and it could envelop any American whose face has ever appeared publicly on the Internet. Clearview’s results return not only names but also phone numbers, addresses and more. The risks are familiar but amplified many times over.
There’s the possibility of false matches, especially because Clearview’s algorithm has never been tested by an outside authority. There’s the risk of a hack exposing a trove of human beings converted into identifiable lines of code. And, of course, there’s the risk of abuse — police at a protest, for example, training the tool on every participant, or a private buyer using it to stalk an ex-lover.
Clearview has pitched its product to entities with less credibility than law enforcement, including a professed “pro-white” Republican congressional candidate for “extreme opposition research.” The company can even monitor clients’ searches, and evidently it does: How else did Clearview know to flag the Times reporter’s face and display no matches after she asked officers to run her through the app? (A “software bug,” the founder replied.)
The case underscores with greater vigor than ever the need for restrictions on facial recognition technology. But putting limits on what the police or private businesses can do with a tool such as Clearview’s won’t stop bad actors from breaking them. There also need to be limits on whether a tool such as Clearview’s can exist in this country in the first place.
Top platforms’ policies generally prohibit the sort of data-scraping Clearview has engaged in, but it’s difficult for a company to protect information that’s on the open Web. Courts have also ruled against platforms when they have tried to go after scrapers under existing copyright or computer fraud law — and understandably, as too-onerous restrictions could hurt journalists and public-interest groups.
Privacy legislation is a more promising area for action, to prevent third parties including Clearview from assembling databases such as these in the first place, whether they’re filled with faces or location records or credit scores. That will take exactly the robust federal framework Congress has so far failed to provide, and a government that’s ready to enforce it.
Read more:
The Post’s View: Why Congress needs to regulate facial-recognition systems
The Post’s View: The facial-recognition future we feared is here
Stephanie Hare: What is it like when police go rogue in a liberal democracy? Look to Britain.
The Post’s View: San Francisco banned facial recognition software. A better strategy would be a moratorium.
The Post’s View: U.S. Customs was right to reverse course on mandatory facial recognition scans