Did Facebook’s mood experiment cross the line?
In early 2012, Facebook altered the news feeds of nearly 700,000 of its users to explore the impact of negative and positive posts on moods. With the study now published, an outcry has erupted because members had not given permission for their feeds to be "manipulated."
The one-week experiment, conducted on anonymized data, found that reducing the number of positive posts in a user's feed led that user to post more negative content, and vice versa. The study was published in the Proceedings of the National Academy of Sciences.
Facebook apologized, but argued that users consent to such research as a condition of using the service.
While Facebook had some sympathizers, many online commentators felt Facebook’s members were being used as "guinea pigs" or "lab rats."
Slate.com called the experiment "unethical" and said, "Facebook intentionally made thousands upon thousands of people sad."
Facebook regularly tweaks its news feed based on user data, including emphasizing certain "friends" or subjects, to improve the user experience. The number of ads and the size of images are also adjusted for similar reasons. The New York Times noted that Google and Yahoo likewise make adjustments based on how people interact with search results or news articles.
But while the social media and search engine giants are generally given permission to use the reams of personal data they collect for targeted advertising, it's murky what other purposes they can put that data to.
According to The Wall Street Journal, Facebook recently used its user data to estimate how many people were visiting Brazil for the World Cup. In February, it released a list of the best places in the U.S. to be single. But many appear to agree with Brian Blau, a Gartner analyst, who told the Times, "Doing psychological testing on people crosses the line."
As with its routine feed tweaks, Facebook defended the research as a way "to improve our services and to make the content people see on Facebook as relevant and engaging as possible," a spokesperson told the New York Business Journal. The spokesperson added that any impact on people's moods was extremely minimal.
At an event late last week in New Delhi, Sheryl Sandberg, Facebook's chief operating officer, again apologized, saying the research was "poorly communicated," and asserted that user privacy was paramount, according to the Guardian.
"We take privacy and security at Facebook really seriously because that is something that allows people to share," Ms. Sandberg said.
- Experimental evidence of massive-scale emotional contagion through social networks – Proceedings of the National Academy of Sciences
- Explanation by Adam Kramer, Facebook data scientist and co-author of the study – Facebook
- Facebook Didn’t Add "Research" Clause To User Agreement Until After Emotional Contagion Experiment – Animal New York
- Facebook’s Unethical Experiment – Slate
- How Researchers Classified Facebook Posts as ‘Happy’ or ‘Sad’ – The Wall Street Journal (sub. required)
- Furor Erupts Over Facebook’s Experiment on Users – The Wall Street Journal (sub. required)
- Facebook Tinkers With Users’ Emotions in News Feed Experiment, Stirring Outcry – The New York Times (tiered sub.)
- Facebook Ran A Huge Psychological Experiment On Users And Manipulated The Emotions Of More Than 600,000 People – Business Insider
- Facebook apologizes for psychological experiments on users – The Guardian
Did Facebook go too far in its use of user data for its mood experiment? What rules or restrictions should guide the use of personal data for research purposes?