Facebook won't block fake news posts because it has no incentive, experts say
The challenge of fake and misleading news has come to the fore in the wake of the US presidential election. Facebook has borne the brunt of the criticism that it allowed misinformation to spread unfettered on its network, skewing people’s perceptions and possibly the outcome of the election – something CEO Mark Zuckerberg vehemently denies.
As pressure on Zuckerberg and Facebook intensifies, the social network has promised to do more to eliminate hoaxes and, like Google, has blocked fake news sites from its ad network. The latter should strangle websites that deliberately publish misleading content by cutting off their advertising revenue.
Yet despite these gestures, Facebook is unlikely to explore the many options available to it because it simply has very little motivation to do so, experts believe.
“Although Mark Zuckerberg is being polite about it, there’s absolutely no way that Facebook will start preventing people from sharing what they want to share. That’s the core idea of the site,” said writer and professor Clay Shirky, who studies social networks.
Facebook’s business model relies on people clicking, sharing and engaging with content – photos, memes, opinions, news and gossip – regardless of veracity. “People trade untrue stories that encapsulate things they believe about the world all the time,” he said. “Facebook is in the business of letting people share stuff they are interested in.”
Preventing any of that sharing interferes with core user behavior. “People share stuff because their mom might like it. My mom likes the pope, she likes Trump so she’ll be pleased that the pope supports Trump,” he said, in reference to a widely shared piece of “news” that falsely claimed the head of the Catholic church endorsed Donald Trump.
People don’t feel duped. We love bedtime stories. We don’t want someone telling us our cherished beliefs are false
It doesn’t make sense for Facebook to apply traditional news publishing values such as verification to a network where a Pepe the Frog meme can carry as much currency as a New York Times op-ed. Facebook is only motivated to censor content when it makes users unhappy, which is why it efficiently polices content it deems to contain nudity, violence and harassment.
But don’t people want to avoid being duped? “People who are told the pope supports their candidate don’t feel duped,” Shirky said. “We love bedtime stories. We don’t want someone telling us which of our cherished beliefs are false.”
He highlights the absurdity of the task with a thought experiment. “Imagine, for a moment, Facebook adding a line to every story about Christianity saying there’s no scientific evidence that Jesus came back from the dead.”
Shirky acknowledges the dilemma of websites that purposefully create fake news to game the system for clicks and advertising revenue. However, he places the onus on the source of the story – the website creating the content – rather than the place it is being shared, ie Facebook.
Facebook is an example of mass amateurization – millions of amateur publishers sharing content. “It simply cannot behave like a professional platform,” he said. “And yet if society doesn’t have a place that polices true stories, it will be a terrible loss for us.”
How to spot fake news
If Facebook won’t tackle the problem head-on, who will?
The Trust Project thinks it has some of the answers. Launched in October 2014, the coalition of more than 60 news media outlets (including the Guardian), as well as academics and social networks, has tried to restore the trusted role of the press in civic life. The coalition aims to establish clear guidelines and trust metrics that could help both consumers and technology companies – through their ranking algorithms – by giving more weight to higher quality sources.
“In today’s burgeoning and chaotic news ecosystem, it is difficult to parse truth from falsehood, wisdom from spin. Legacy newspapers, digital media ventures, sponsored content and social media clamor for our attention,” the project’s leaders, Richard Gingras and Sally Lehrman, said when it launched.
“We have seen a decline in trust in the media over a period of decades and the polarization across what types of media are trusted,” Lehrman told the Guardian. “There’s very high trust of Fox News among conservatives who say they don’t trust the media in general. Liberals are more likely to trust the news, particularly NPR, CNN and the New York Times,” she added, citing the Reuters Institute’s 2015 digital news report.
“Fake news is part of the problem, but there’s also news that’s poorly sourced and produced, as well as advertising and propaganda. We are trying to help people identify the difference.”
Working with news organizations across the US and Europe, as well as interviewing members of the public, the Trust Project compiled a priority list of possible trust indicators. These include getting news organizations to meet a set of best practices including verification, having an ethics policy and a diversity policy, and revealing their ownership structure and funding sources.
Other indicators include author biographies covering areas of expertise and experience; citations and references; labels to indicate whether a piece is news, analysis, opinion or advertising; whether a story contains original reporting; whether it includes diverse voices; whether there is a way for members of the public to provide feedback to the newsroom; and whether reporters are local to the news event.
These indicators should be useful to both members of the public and the algorithms that rank and distribute news content, including those developed by Facebook, Twitter and Google.
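As a rough illustration of how such signals could feed a ranking algorithm, here is a minimal sketch in which a handful of hypothetical indicators are combined into a trust score that scales a story’s raw engagement. The field names, weights and blending formula are invented for this example; they are not drawn from the Trust Project’s specification or from any platform’s actual ranking system.

from dataclasses import dataclass

@dataclass
class TrustIndicators:
    # Hypothetical fields loosely modelled on the indicators described above.
    has_ethics_policy: bool
    discloses_ownership: bool
    has_author_bio: bool
    cites_sources: bool
    original_reporting: bool
    labelled_type: str  # "news", "analysis", "opinion" or "advertising"

# Invented weights; a real system would tune these against editorial judgment.
WEIGHTS = {
    "has_ethics_policy": 0.2,
    "discloses_ownership": 0.2,
    "has_author_bio": 0.1,
    "cites_sources": 0.2,
    "original_reporting": 0.2,
}

def trust_score(indicators: TrustIndicators) -> float:
    # Sum the weights of the indicators a story satisfies, add a small bonus
    # for clear labelling, and cap the result at 1.0.
    score = sum(w for name, w in WEIGHTS.items() if getattr(indicators, name))
    if indicators.labelled_type in {"news", "analysis", "opinion", "advertising"}:
        score += 0.1
    return min(score, 1.0)

def ranked_score(engagement: float, indicators: TrustIndicators) -> float:
    # Blend raw engagement with source quality rather than replacing it, so a
    # well-documented newsroom outranks an anonymous hoax site that generates
    # the same number of clicks, without anything being censored outright.
    return engagement * (0.5 + 0.5 * trust_score(indicators))

In this toy model, a story from a source that meets every indicator carries roughly twice the weight of one from a source that meets none, for the same level of engagement.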
The Trust Project has organized a hackathon at the end of November to see if these indicators can be incorporated into newsroom workflow and easily surfaced on distribution platforms.
“We’ve been talking to social media including Facebook, Twitter and LinkedIn to get them on board, and they’ve all expressed a lot of interest. We’d like to see how we can work together to implement and find ways to apply the indicators,” Lehrman said.
Of course, Facebook would still have to incorporate these new signals into its algorithm, but it wouldn’t have to make individual editorial judgements per se (even though it’s already doing this in some cases).
If we can stop spam, why not fake news?
But what’s in it for Facebook? Investor and writer Om Malik thinks that Facebook might be motivated to change to avoid looking bad.
“This is a company that talks about artificial intelligence and the idea that it can’t deal with fake news shows it’s not intelligent at all. It’s super dumb,” he said. “If I am running a platform there’s a huge difference between the Guardian, the New York Times and fake sites spreading bullshit.”
Facebook, in his view, should treat fake news in the same way that email providers treat spam. “We keep getting spam in email and yet we are able to stop it. There have to be solutions to spammy information inside platforms like Facebook.”
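To make the spam analogy concrete, here is a minimal sketch of the kind of text classifier email providers use against junk mail, pointed at headlines instead. The training headlines and labels are invented toy data, and nothing in the article suggests Facebook uses this particular approach; the point is only that “spammy information” can, in principle, be scored the way spam is.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Toy training set: headlines already judged by fact-checkers (invented here).
headlines = [
    "Pope Francis endorses Donald Trump, shocks world",
    "FBI agent in Clinton email case found dead in murder-suicide",
    "Senate passes appropriations bill after late-night session",
    "Federal Reserve holds interest rates steady",
]
labels = [1, 1, 0, 0]  # 1 = known hoax, 0 = legitimate reporting

vectorizer = CountVectorizer()
model = MultinomialNB().fit(vectorizer.fit_transform(headlines), labels)

# Score a new headline before it spreads; a production system would need far
# more data and extra signals such as the publishing domain's track record.
new_headline = vectorizer.transform(["Pope endorses candidate in shock statement"])
print(model.predict_proba(new_headline)[0][1])  # estimated probability of a hoax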
Malik suggests that Facebook is commercially motivated to keep fake news on the platform. “As long as people are engaged and stay on the network, it works for them.”
However, as demonstrated by a YouGov poll, people do care about the spread of false information on Facebook. Some 72% of people said that Facebook should filter out fake news stories and hoaxes.
In the long run, the spread of misinformation could erode trust in the entire system. “They can’t wash their hands and say, ‘We don’t want to interfere’,” Malik said. “Why should we then believe any video stream or any brand that appears on Facebook?”
Despite the stern words, Malik thinks that the fake news fiasco is a temporary blip, though he says he welcomes the pressure the media is applying to encourage Facebook to be more thoughtful about the quality and influence of the content shared on its platform.
For Shirky, the problem is much bigger than Facebook and is actually about the way that the internet challenges traditional institutions – in this case the media. In this way Facebook’s problem, he says, reflects some of the challenges Airbnb faces.
“When an institution lasts a long time, society layers a whole bunch of rules and regulations on top of that institution. When the institution changes, those rules and regulations break,” he said, pointing to Airbnb’s problems dealing with racism among its hosts. “We allow individual homeowners to say who can and can’t stay in their home. We require hotels to offer accommodations to people with no regard to race. So what happens when you build a hotel out of individual hotel owners?”
Similarly, Facebook’s current structural challenge is misinformation. “You can’t simply take the rules of the old institution and slap it onto new ones. This requires a deep renegotiation.
“Facebook absorbed the news media and now we want them to behave like editors. It’s impossible to enforce.”