


Facebook policy head says emotion experiments were 'innovative'
Facebook’s head of policy, Monika Bickert, has claimed that the company's emotion experiments were “innovation” and a necessary part of the research needed for new features and developments.
Speaking at the Aspen Ideas Festival on Tuesday, Bickert said that “most of the research that is done on Facebook” is “all about ‘how do we make this product better’”.
“It’s concerning when we see legislation that could possibly stifle that sort of creativity and that innovation,” said Bickert, in contrast to the apology offered on Wednesday by Sheryl Sandberg, Facebook’s chief operating officer.
'Make sure we’re being transparent'
“That’s innovation. That’s the reason that when you look at Facebook or YouTube you’re always seeing new features. And that’s the reason if you have that annoying friend from high school that always posts pictures of their toddler every single day, that’s the reason that you don’t see all those photos in your news feed,” said Bickert.
Bickert admitted that Facebook’s handling of the research could have been clearer to users, saying that “it’s incumbent upon us to make sure we’re transparent about what we’re doing and that people understand exactly why we’re doing what we do.”
“What I think we have to do in the future is make sure we’re being transparent, both to regulators and to people using the product about exactly what we’re doing,” she said.
'Emotional contagion'
The experiments caused outrage among users and researchers alike over the weekend, when a scientific paper based on the research, which involved nearly 700,000 users over one week in 2012, was published.
Facebook hid "a small percentage" of emotional words from people's news feeds, without their knowledge, to test what effect that had on the statuses or "likes" that they then posted or reacted to.
The company's researchers discovered that tweaking the content of people's "news feeds" created an "emotional contagion" across the social network, by which people who saw one emotion being expressed would themselves express similar emotions.
The research was conducted without the knowledge of users and has caused controversy as to whether it crossed an ethical line, with the paper’s publishers conducting an investigation. The UK’s data protection watchdog, the Information Commissioner’s Office, is looking into the situation to see if any UK laws were broken.
• If Facebook can tweak our emotions and make us vote, what else can it do?