Facebook is facing criticism following the revelation that it conducted an emotional manipulation study on nearly 700,000 users, without their knowledge, during one week in January 2012.
The world's largest social media site collaborated with Cornell University and the University of California, San Francisco, on the research, details of which were published recently. The researchers manipulated information posted on users' home pages to find out whether "exposure to emotions led people to change their own posting behaviours," reports the BBC.
The "emotional contagion" experiment involved Facebook filtering users' news feeds: the flow of comments, videos, pictures and web links shared by other users in their network. "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks," the study's findings state.
Concerns were raised over the fact that the research was carried out without users' knowledge. But the experiment was not illegal, reports The Atlantic: under Facebook's terms of service, users consent to the use of their data for "data analysis, testing, [and] research" when they sign up to join the social media network.
However, ethical questions remain:
"Let's call the Facebook experiment what it is: a symptom of a much wider failure to think about ethics, power and consent on platforms," said Kate Crawford in a Twitter post.
"I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it's possible," tweeted privacy activist Lauren Weinstein.
A report by The Guardian also says that Labour MP Jim Sheridan has called for a parliamentary investigation into such an intrusive study: "This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be, to protect people." Sheridan went on to say: "They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it."
The study's lead author, Adam D.I. Kramer, posted a public apology on his Facebook page on Sunday afternoon, saying: "I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused."
Kramer explained that the research was conducted because the team felt it was "important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook." Kramer admitted that "In hindsight, the research benefits of the paper may not have justified all of this anxiety."
This research reminds me of the well-known "radiators and drains" idea about personalities. Turning Facebook into a consistently negative news source for a large group of people, for the sake of an experiment, doesn't sound very enlightening or worthwhile.