You can find the current article at its original source at https://www.nytimes.com/2019/06/05/business/youtube-remove-extremist-videos.html

YouTube to Remove Thousands of Videos Pushing Extreme Views

YouTube announced plans on Wednesday to remove thousands of videos and channels that advocate for neo-Nazism, white supremacy and other bigoted ideologies in an attempt to clean up extremism and hate speech on its popular service.
The new policy will ban “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion,” the company said in a blog post. The prohibition will also cover videos denying that violent incidents, like the mass shooting at Sandy Hook Elementary School in Connecticut, took place.
YouTube did not name any specific channels or videos that would be banned.
“It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence,” the company said in the blog post.
The decision by YouTube, which is owned by Google, is the latest action by a Silicon Valley company to stem the spread of hate speech and disinformation on its site. A month ago, Facebook evicted seven of its most controversial users, including Alex Jones, the conspiracy theorist and founder of InfoWars. Twitter banned Mr. Jones last year.
The companies have come under intense criticism for their delayed reaction to the spread of hateful and false content. At the same time, President Trump and others argue that the giant tech platforms censor right-wing opinions, and the new policies put in place by the companies have inflamed those debates.
The tension was evident on Tuesday, when YouTube said that a prominent right-wing creator who used racial language and homophobic slurs to harass a journalist in videos on YouTube did not violate its policies. The decision set off a firestorm online, including accusations that YouTube was giving a free pass to some of its popular creators.
In the videos, that creator, Steven Crowder, a conservative commentator with nearly four million YouTube subscribers, repeatedly insulted Carlos Maza, a journalist from Vox. Mr. Crowder used slurs about Mr. Maza’s Cuban-American ethnicity and sexual orientation. Mr. Crowder said that his comments were harmless, and YouTube determined they did not break its rules.
“Opinions can be deeply offensive, but if they don’t violate our policies, they’ll remain on our site,” YouTube said in a statement about its decision on Mr. Crowder.
The back-to-back decisions illustrated a central theme that has defined the moderation struggles of social media companies: Making rules is often easier than enforcing them.
“This is an important and long-overdue change,” Becca Lewis, a research affiliate at the nonprofit organization Data & Society, said about the new policy. “However, YouTube has often executed its community guidelines unevenly, so it remains to be seen how effective these updates will be.”
YouTube’s scale — more than 500 hours of new videos are uploaded every minute — has made it difficult for the company to track rule violations. And the company’s historically lax approach to moderating extreme videos has led to a drumbeat of scandals, including accusations that the site has promoted disturbing videos to children and allowed extremist groups to organize on its platform. YouTube’s automated advertising system has paired offensive videos with ads from major corporations, prompting several advertisers to abandon the site.
The kind of content that will be prohibited under YouTube’s new hate speech policies includes videos that claim Jews secretly control the world, those that say women are intellectually inferior to men and therefore should be denied certain rights, or those that suggest the white race is superior to another race, a YouTube spokesman said.
Channels that post some hateful content, but that do not violate YouTube’s rules with the majority of their videos, may receive strikes under YouTube’s three-strike enforcement system, but would not be immediately banned.
The company also said that channels that “repeatedly brush up against our hate speech policies,” but don’t violate them outright, would be removed from YouTube’s advertising program, which allows channel owners to share in the advertising revenue their videos generate.
In addition to tightening its hate speech rules, YouTube announced it would also tweak its recommendation algorithm, the automated software that shows users videos based on their interests and past viewing habits. This algorithm is responsible for more than 70 percent of overall time spent on YouTube, and has been a major engine for the platform’s growth. But it has also drawn accusations of leading users down rabbit holes filled with extreme and divisive content, in an attempt to keep them watching and drive up the site’s usage numbers.
“If the hate and intolerance and supremacy is a match, then YouTube is lighter fluid,” said Rashad Robinson, president of the civil rights nonprofit Color of Change. “YouTube and other platforms have been quite slow to address the structure they’ve created to incentivize hate.”
In response to the criticism, YouTube announced in January that it would recommend fewer objectionable videos, such as those with 9/11 conspiracy theories and vaccine misinformation, a category it called “borderline content.” The YouTube spokesman said on Tuesday that the algorithm changes had resulted in a 50 percent drop in recommendations to such videos in the United States. He declined to share specific data about which videos YouTube considered “borderline.”
“Our systems are also getting smarter about what types of videos should get this treatment, and we’ll be able to apply it to even more borderline videos moving forward,” the company’s blog post said.
Other social media companies have faced criticism for allowing white supremacist content. Facebook recently banned a slew of accounts, including those of Paul Joseph Watson, a contributor to the conspiracy theory website Infowars, and Laura Loomer, a far-right activist. Twitter bans violent extremist groups but allows some of their members to maintain personal accounts — for instance, the Ku Klux Klan was banned from Twitter last August, while its former leader, David Duke, remains on the service. Twitter is currently studying whether the removal of content is effective in stemming the tide of radicalization online. A Twitter spokesman declined to comment on the study.
When Twitter banned Mr. Jones last year, he responded with a series of videos decrying the platform’s decision and drumming up donations from his supporters.
YouTube’s ban of white supremacists could prompt a similar cycle of outrage and grievance, said Joan Donovan, the director of the Technology and Social Change Research Project at Harvard. The ban, she said, “presents an opportunity for content creators to get a wave of media attention, so we may see some particularly disingenuous uploads.”
“I wonder to what degree will the removed content be amplified on different platforms, and get a second life?” Ms. Donovan added.