The original article is available at https://www.nytimes.com/2023/06/13/opinion/encryption-messaging-privacy-signal-whatsapp.html

One of the Last Bastions of Digital Privacy Is Under Threat
We might think of most of our day-to-day activities as private. Rarely is anyone deliberately eavesdropping on our conversations, spying on where we shop or following us on our commute. The government needs a search warrant or other court order to listen to our phone calls, to discover what books we checked out from the library or to read our mail.
But a tsunami of digital tracking technology has made a large portion of our lives public by default. Nearly everything we do online and on our phones — our movements, our conversations, our reading, watching and shopping habits — is being watched by commercial entities whose data can often be used by governments.
One of the last bastions of privacy is encrypted messaging programs such as Signal and WhatsApp. These apps, which employ a technology called end-to-end encryption, are designed so that even the app makers themselves cannot view their users’ messages. Texting on one of these apps — particularly if you use the “disappearing messages” feature — can be almost as private and ephemeral as most real-life conversations used to be.
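To see why that design matters, consider a minimal sketch, assuming the open-source PyNaCl library rather than the apps' actual protocols (Signal and WhatsApp add key ratcheting and much else on top): the private keys live only on the users' devices, so the server in the middle can relay messages without ever being able to read them.

```python
# Minimal sketch of end-to-end encryption with PyNaCl (pip install pynacl).
# Illustrative only; Signal and WhatsApp use far more elaborate protocols.
from nacl.public import PrivateKey, Box

# Each participant generates a key pair on their own device.
# Private keys never leave the device; only public keys are exchanged.
alice = PrivateKey.generate()
bob = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
message = Box(alice, bob.public_key).encrypt(b"meet at noon")

# The relay server sees only this ciphertext. Without a private key,
# it cannot recover the plaintext, which is the whole point.
plaintext = Box(bob, alice.public_key).decrypt(message)
assert plaintext == b"meet at noon"
```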
However, governments are increasingly demanding that tech companies surveil encrypted messages in a new and dangerous way. For years, nations sought a master key to unlock encrypted content with a search warrant but largely gave up because they couldn’t prove they could keep such a key safe from bad actors. Now they are seeking to force companies to monitor all their content, whether or not it is encrypted.
The campaign to institute mass suspicionless searches is global. In Britain, the Online Safety Bill, which is making its way through Parliament, demands that messaging services identify and remove child exploitation images, “whether communicated publicly or privately by means of the service.” In the United States, bills introduced in Congress require online services to identify and remove such images. And in the European Union, a leaked memo has revealed that many member countries support weakening encryption as part of the fight against child exploitation.
This surge of regulatory efforts is part of a larger worldwide concern about the prevalence of child exploitation images online. Although substantiated cases of child sexual abuse have thankfully been on a steep decline in the United States — down 63 percent since 1990, according to the University of New Hampshire Crimes Against Children Research Center — the prevalence of sexual images of children circulating online has risen sharply, swamping the National Center for Missing and Exploited Children’s CyberTipline with 32 million reports in 2022.
The deluge of online reports reflects how images can be duplicated and shared limitlessly online and that there are more images available — not just ones that adults take of children but also images that children and teenagers share with one another that are later distributed publicly or commercialized, according to David Finkelhor, the director of the University of New Hampshire center.
The recent legislative proposals are focused on detecting these images as they circulate online. But once you are in the business of scanning content, you are in the surveillance business — and that is not what we want from the companies that hold our most intimate communications.
Apple learned this lesson the hard way two years ago when it proposed a technical scheme that it claimed would be able to identify known child exploitation images on users’ devices without anyone actually looking at users’ photos.
Apple’s proposal would have downloaded onto every device a secret list of IDs corresponding to known exploitation images. It would then use an algorithm to determine whether any photos on the device were similar to those on the list.
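Apple never published the full details of its matcher, called NeuralHash, so the sketch below is only a generic illustration of perceptual-hash matching, using a simple average hash; the Pillow library, the blocklist entry and the threshold are all assumptions for the example. The point to notice is the distance threshold: the system flags photos that are merely similar to a listed fingerprint, which is what makes it probabilistic.

```python
# A generic perceptual-hash matcher, for illustration only. This is NOT
# Apple's NeuralHash (a neural-network-based hash); the blocklist entry
# and threshold below are invented. Assumes Pillow (pip install Pillow).
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size grayscale; bit = 1 where a pixel beats the mean."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

# Hypothetical fingerprints of known images (normally kept secret on-device).
BLOCKLIST = {0x81C3E7FF7EE7C381}
THRESHOLD = 5  # "similar enough" cutoff; the source of false matches

def flagged(path: str) -> bool:
    """Report a photo if its fingerprint is near any blocklisted fingerprint."""
    h = average_hash(path)
    return any(hamming(h, known) <= THRESHOLD for known in BLOCKLIST)
```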
There were two major problems. First, there was the possibility that the program might falsely label innocent photos as illegal. Like all matching algorithms, Apple’s system makes educated guesses based on statistical probabilities, but those guesses could be wrong. In a survey of technical papers about scanning systems like the one Apple proposed, two Princeton researchers, Sarah Scheffler and Jonathan Mayer, found that estimates of false positives ranged from 135 to 4.5 million per day, assuming 7.5 billion messages sent worldwide each day. That’s a lot of innocent messages that would have been forwarded to the police for investigation.
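A quick back-of-the-envelope calculation with the survey's own figures shows what those rates mean per message:

```python
# Per-message error rates implied by the survey's figures: 135 to 4.5
# million false positives per day against 7.5 billion daily messages.
MESSAGES_PER_DAY = 7.5e9

for per_day in (135, 4.5e6):
    rate = per_day / MESSAGES_PER_DAY
    print(f"{per_day:>12,.0f} false flags/day -> {rate:.2e} per message")

# Output:
#          135 false flags/day -> 1.80e-08 per message
#    4,500,000 false flags/day -> 6.00e-04 per message
# The high end works out to roughly one in every 1,700 messages.
```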
The second, and greater, problem was that scanning for one type of content opens the door to scanning for other types of content. If Apple had a device-scanning system in place, India could demand scanning for illegal blasphemy, China could demand scanning for illegal anti-Communist content and U.S. states that have outlawed abortion or gender-affirming care could scan to identify people seeking those services. In other words, it would likely be a free-for-all for every type of surveillance out there.
There is a long history of surveillance technology being used initially for a benign or well-meaning purpose and morphing to a more sinister use. Taylor Swift, in 2018, pioneered using facial recognition at concerts to scan for known stalkers, but within a few years, Madison Square Garden was using the technology to block lawyers it was in a dispute with from entering the arena.
Thousands of privacy and security experts protested Apple’s plan to scan for images of abuse, signing an open letter saying it had the “potential to bypass any end-to-end encryption that would otherwise safeguard the user’s privacy.” Under pressure, Apple backed down.
The new legislative proposals — which would make companies liable for everything on their networks even if they can’t see it — will inevitably lead to enforcement efforts that aren’t much different from Apple’s doomed plan. And these new efforts may not even be constitutional. In the United States, a group of scholars wrote to the Senate last month to protest that forced scanning could violate the Fourth Amendment’s prohibition on unreasonable searches and seizures, “which precludes the government from having a private actor conduct a search it could not lawfully do itself.”
The question is philosophical, not technical. Do we want to start allowing the government to require companies to conduct suspicionless, warrantless searches of our messages with family, friends and co-workers?
Opening the door to dragnet searches of people’s phones for evidence of possible crime is closer to the work of intelligence agencies than of policing. And in the United States, we have largely restricted intelligence gathering to focus on foreigners and on national security issues such as terrorism. (And when intelligence gathering has gone too far in surveilling domestic Muslim communities or people’s phone call records, lawmakers have condemned it and changed relevant laws.)
Under current law, nothing is stopping the police from getting a search warrant to examine the devices of those whom they suspect of a crime. And despite the F.B.I.’s claims that encryption hurts its ability to catch criminals, the agency has had some spectacular successes overcoming encryption. Among them: using an Australian hacking firm in 2016 to unlock the encrypted iPhone of the San Bernardino mass murderer and obtaining data from Signal messages that led to the conviction of members of the Oath Keepers organization for their role in the Jan. 6 insurrection.
Search warrants have long been the line we have drawn against overly intrusive government surveillance. We need to hold that line and remind lawmakers: No warrant, no data.
Julia Angwin is a contributing Opinion writer, an investigative journalist and the author of “Dragnet Nation: A Quest for Privacy, Security and Freedom in a World of Relentless Surveillance.”
Source photographs by Runstudio and Dougal Waters, via Getty Images, and Hans Rodenbröker.