Facebook admits it has not done enough to quell hate in Myanmar
Facebook has admitted it did not do enough to prevent the incitement of violence and hate speech in Myanmar, after a report it commissioned concluded that it had become a platform for harmful and racially inflammatory content.
The report by the San Francisco-based nonprofit Business for Social Responsibility (BSR) found that, in Myanmar, “Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence.”
There are now 20 million Facebook users in Myanmar, and it is used by large numbers of people as their main source of news in the absence of a free media.
But the report concluded that Facebook was being used by “bad actors” to spread hate speech, incite violence and coordinate harm in Myanmar, echoing findings by civil society and tech groups, some of which have been highlighting the issue to the social media giant for over four years.
A large proportion of this hate speech has been directed towards the Rohingya, the Muslim minority in Myanmar.
In April, the Guardian reported that hate speech on Facebook in Myanmar had exploded during the Rohingya crisis, which was triggered by a military crackdown in Rakhine state in August 2017. Tens of thousands of Rohingya were killed, raped and assaulted, villages were razed to the ground and more than 700,000 Rohingya fled over the border to Bangladesh.
The recent UN fact-finding mission to Myanmar, which concluded a genocide had taken place against the Rohingya in Rakhine, specifically singled out the role of Facebook in fanning the flames of anti-Muslim sentiment and violence.
Alex Warofka, a Facebook product policy manager, said in a blog post that the report demonstrated that “prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.”
The company said it was tackling the problem, hiring 100 native Myanmar speakers this year to review content. It took action on around 64,000 pieces of content in Myanmar for violating hate speech policies in 2018, and took down 18 accounts and 52 pages associated with figures in the Myanmar military who were named in the UN fact-finding report as being involved in the genocide and ethnic cleansing in Rakhine.
However, the BSR report made it clear that, given the “complex social and political context of Myanmar”, the social media giant did not yet have the problem under control and there was still a “high likelihood” of hate speech being posted on Facebook in Myanmar.
The report said the consequences for the victims of this hate speech on Facebook were “severe, with lives and bodily integrity placed at risk from incitement to violence”.
One interviewee quoted in the report said: “Activists are being harassed, self-censorship exists, and activity on Facebook today is closing freedom of expression, rather than increasing it. One side is shutting down the other, and it is no longer a marketplace of ideas.”
In particular, the report highlighted the upcoming 2020 general elections in Myanmar as a cause for concern.
“Today’s challenging circumstances are likely to escalate in the run-up to the election, and Facebook would be well-served by preparing for multiple eventualities now,” the report’s authors warned.