I Shouldn’t Have to Publish This in The New York Times
Editors’ note: This is part of a series, “Op-Eds From the Future,” in which science fiction authors, futurists, philosophers and scientists write Op-Eds that they imagine we might read 10, 20 or even 100 years from now. The challenges they predict are imaginary — for now — but their arguments illuminate the urgent questions of today and prepare us for tomorrow. The opinion piece below is a work of fiction.
I shouldn’t have to publish this in The New York Times.
Ten years ago, I could have published this on my personal website, or shared it on one of the big social media platforms. But that was before the United States government decided to regulate both the social media platforms and blogging sites as if they were newspapers, making them legally responsible for the content they published.
The move was spurred on by an unholy and unlikely coalition of media companies crying copyright; national security experts wringing their hands about terrorism; and people who were dismayed that our digital public squares had become infested by fascists, harassers and cybercriminals. Bit by bit, the legal immunity of the platforms was eroded — from the judges who put Facebook on the line for the platform’s inaction during the Provo Uprising to the lawmakers who amended section 230 of the Communications Decency Act in a bid to get Twitter to clean up its Nazi problem.
While the media in the United States remained protected by the First Amendment, members of the press in other countries were not so lucky. The rest of the world responded to the crisis by tightening rules on acceptable speech. But even the most prolific news service — a giant wire service like AP-AFP or Thomson-Reuters-TransCanada-Huawei — only publishes several thousand articles per day. And thanks to their armies of lawyers, editors and insurance underwriters, they are able to make the news available without falling afoul of new rules prohibiting certain kinds of speech — including everything from Saudi blasphemy rules to Austria’s ban on calling politicians “fascists” to Thailand’s stringent lèse-majesté rules. They can ensure that news in Singapore is not “out of bounds” and that op-eds in Britain don’t call for the abolition of the monarchy.
But not the platforms — they couldn’t hope to make a dent in their users’ personal expressions. From YouTube’s 2,000 hours of video uploaded every minute to Facebook-Weibo’s three billion daily updates, there was no scalable way to carefully examine the contributions of every user and assess whether they violated any of these new laws. So the platforms fixed this the Silicon Valley way: They automated it. Badly.
Which is why I have to publish this in The New York Times.
The platforms and personal websites are fine if you want to talk about sports, relate your kids’ latest escapades or shop. But if you want to write something about how the platforms and government legislation can’t tell the difference between sex trafficking and sex, nudity and pornography, terrorism investigations and terrorism itself or copyright infringement and parody, you’re out of luck. Any one of those keywords will give the filters an incurable case of machine anxiety — but all of them together? Forget it.
If you’re thinking, “Well, all that stuff belongs in the newspaper,” then you’ve fallen into a trap: Democracies aren’t strengthened when a professional class gets to tell us what our opinions are allowed to be.
And the worst part is, the new regulations haven’t ended harassment, extremism or disinformation. Hardly a day goes by without some post full of outright Nazism, flat-eartherism and climate trutherism going viral. There are whole armies of Nazis and conspiracy theorists who do nothing but test the filters, day and night, using custom software to find the adversarial examples that slip past the filters’ machine-learning classifiers.
It didn’t have to be this way. Once upon a time, the internet teemed with experimental, personal publications. The mergers and acquisitions and anticompetitive bullying that gave rise to the platforms and killed personal publishing made Big Tech both reviled and powerful, and they were targeted for breakups by ambitious lawmakers. Had we gone that route, we might have an internet that was robust, resilient, variegated and dynamic.
Think back to the days when companies like Apple and Google — back when they were stand-alone companies — bought hundreds of start-ups every year. What if we’d put a halt to the practice, re-establishing the traditional antitrust rules against “mergers to monopoly” and acquiring your nascent competitors? What if we’d established an absolute legal defense for new market entrants seeking to compete with established monopolists?
Most of these new companies would have failed — if only because most new ventures fail — but the survivors would have challenged the Big Tech giants, eroding their profits and giving them less lobbying capital. They would have competed to give the best possible deals to the industries that tech was devouring, like entertainment and news. And they would have competed with the news and entertainment monopolies to offer better deals to the pixel-stained wretches who produced the “content” that was the source of all their profits.
But instead, we decided to vest the platforms with statelike duties to punish them for their domination. In doing so, we cemented that domination. Only the largest companies can afford the kinds of filters we’ve demanded of them, and that means that any would-be trustbuster who wants to break up the companies and bring them to heel first must unwind the mesh of obligations we’ve ensnared the platforms in and build new, state-based mechanisms to perform those duties.
Our first mistake was giving the platforms the right to decide who could speak and what they could say. Our second mistake was giving them the duty to make that call, a billion times a day.
Still, I am hopeful, if not optimistic. Google did not exist 30 years ago; perhaps in 30 years’ time, it will be a distant memory. It seems unlikely, but then again, so did the plan to rescue Miami and the possibility of an independent Tibet — two subjects that are effectively impossible to discuss on the platforms. In a world where so much else is up for grabs, finally, perhaps, we can once again reach for a wild, woolly, independent and free internet.
It’s still within our reach: an internet that doesn’t force us to choose between following the algorithmically enforced rules or disappearing from the public discourse; an internet where we can host our own discussions and debate the issues of the day without worrying that our words will disappear. In the meantime, here I am, forced to publish in The New York Times. If only that were a “scalable solution,” you could do so as well.
Cory Doctorow (@doctorow) is a science fiction writer whose latest book is “Radicalized,” a special consultant to the Electronic Frontier Foundation and an M.I.T. Media Lab research affiliate.