A Mass Murder of, and for, the Internet
Before entering a mosque in Christchurch, New Zealand, the site of one of the deadliest mass murders in the country’s history, the accused gunman paused to endorse a YouTube star in a video that appeared to capture the shooting.
“Remember, lads, subscribe to PewDiePie,” he said.
To an untrained eye, this would have seemed like a bizarre detour.
But the people watching the video stream recognized it as something entirely different: a meme.
Like many of the things the suspect appears to have done in preparation for the shooting on Friday — posting a 74-page manifesto that named specific internet figures who had influenced his views, or writing that the video game Fortnite “trained me to be a killer” — the PewDiePie endorsement served two purposes. For his online followers, it was a kind of satirical Easter egg. (“Subscribe to PewDiePie,” which began as a grass-roots online attempt to keep the popular YouTube entertainer from being dethroned as the site’s most-followed account, has morphed into a kind of all-purpose cultural bat signal for the young and internet-absorbed.)
For everyone else, it was a booby trap, a joke designed to ensnare unsuspecting people and members of the media into taking it too literally. The goal, if there was one, may have been to pull a popular internet figure into a fractious blame game and inflame political tensions everywhere. (In a tweet early Friday morning, PewDiePie, whose real name is Felix Kjellberg, said, “I feel absolutely sickened having my name uttered by this person.”)
The details that have emerged about the Christchurch shooting — at least 49 were killed at two mosques — are horrifying. But a surprising thing about it is how unmistakably online the violence was, and how aware the suspected gunman appears to have been about how his act would be viewed and interpreted by distinct internet subcultures.
In some ways, it felt like a first — an internet-native mass shooting, conceived and produced entirely within the irony-soaked discourse of modern extremism.
The suspected gunman teased his act on Twitter, announced it on the online message board 8chan, and broadcast it live on Facebook. The footage was then replayed endlessly on YouTube, Twitter and Reddit, as the platforms scrambled to take down the clips nearly as fast as new copies popped up to replace them. In a statement on Twitter, Facebook said it had “quickly removed both the shooter’s Facebook and Instagram accounts and the video,” and was taking down instances of praise or support for the shooting. YouTube said it was “working vigilantly to remove any violent footage” of the attack. Reddit said in a statement that it was taking down “content containing links to the video stream or manifesto.”
Even the language the suspect used to describe his attack before the fact framed it as an act of internet activism. In his post on 8chan, he referred to the shooting as a “real life effort post.” He titled an image “screw your optics,” a reference to a line posted by the man accused in the Pittsburgh synagogue shooting that later became a kind of catchphrase among neo-Nazis. And his manifesto — a wordy mixture of white nationalist boilerplate, fascist declarations and references to obscure internet jokes — seems to have been written from the bottom of an algorithmic rabbit hole.
It would be unfair to blame the internet for this. Motives are complex, lives are complicated, and we don’t yet know all the details about the shooting. The authorities in New Zealand have charged a man but have not identified him. Anti-Muslim violence is not an online phenomenon, and white nationalist hatred long predates 4chan and Reddit.
But we do know that the design of internet platforms can create and reinforce extremist beliefs. Their recommendation algorithms often steer users toward edgier content, a loop that results in more time spent on the app, and more advertising revenue for the company. Their hate speech policies are weakly enforced. And their practices for removing graphic videos — like the ones that circulated on social media for hours after the Christchurch shooting, despite the companies’ attempts to remove them — are inconsistent at best.
We also know that many recent acts of offline violence bear the internet’s imprint. Robert Bowers, the man charged with killing 11 people and wounding six others at the Tree of Life synagogue in Pittsburgh, was a frequent user of Gab, a social media platform beloved by extremists. Cesar Sayoc, the man charged with sending explosives to prominent critics of President Trump last year, was immersed in a cesspool of right-wing Facebook and Twitter memes.
People used to conceive of “online extremism” as distinct from the extremism that took form in the physical world. If anything, the racism and bigotry on internet message boards felt a little less dangerous than the prospect of Ku Klux Klan marches or skinhead rallies.
Now, online extremism is just regular extremism on steroids. There is no offline equivalent of the experience of being algorithmically nudged toward a more strident version of your existing beliefs, or having an invisible hand steer you from gaming videos to neo-Nazism. The internet is now the place where the seeds of extremism are planted and watered, where platform incentives guide creators toward the ideological poles, and where people with hateful and violent beliefs can find and feed off one another.
So the pattern continues. People become fluent in the culture of online extremism, they make and consume edgy memes, they cluster and harden. And once in a while, one of them erupts.
In the coming days, we should attempt to find meaning in the lives of the victims of the Christchurch attack, and not glorify the attention-grabbing tactics of the alleged gunman. We should also address the specific horror of anti-Muslim violence.
At the same time, we need to understand and address the poisonous pipeline of extremism that has emerged over the past several years, whose ultimate effects are impossible to quantify but clearly far too big to ignore. It’s not going away, and it’s not getting any better. We will feel it for years to come.