Did Facebook’s mood experiment cross the line?

In early 2012, Facebook altered the news feeds of nearly 700,000 of its users to explore the impact of negative and positive posts on moods. When the study was released, an outcry erupted because members had not given permission for their feeds to be "manipulated."

The one-week experiment, conducted anonymously, found that reducing the number of positive posts in a user's news feed led that user to produce more negative content, and vice versa. The study was published in the Proceedings of the National Academy of Sciences.

Facebook apologized, but argued that users consent to such research as a condition of using the service.

While Facebook had some sympathizers, many online commentators felt Facebook’s members were being used as "guinea pigs" or "lab rats."

Slate.com called the experiment "unethical" and said, "Facebook intentionally made thousands upon thousands of people sad."

Facebook regularly tweaks its news feed based on user data, including emphasizing certain "friends" or subjects, to improve the user experience. The number of ads and the size of images are also adjusted for similar reasons. The New York Times noted that Google and Yahoo likewise make adjustments based on how people interact with search results or news articles.

But while permission is generally given to the social media and search engine giants to use the reams of personal data they’re collecting for targeted advertising, it’s murky what other purposes they can use it for.

According to The Wall Street Journal, Facebook recently estimated how many people were visiting Brazil for the World Cup based on its user data. In February, it released a list of the best places in the U.S. to be single. But many appear to agree with Brian Blau, a Gartner analyst, who told the Times, "Doing psychological testing on people crosses the line."

Likening the research to its routine feed tweaks, a Facebook spokesperson told the New York Business Journal that such research is done "to improve our services and to make the content people see on Facebook as relevant and engaging as possible." The spokesperson added that any impact on people’s moods was extremely minimal.

At an event late last week in New Delhi, Sheryl Sandberg, Facebook’s chief operating officer, again apologized, saying the research was "poorly communicated," and asserted that user privacy was paramount, according to the Guardian.

"We take privacy and security at Facebook really seriously because that is something that allows people to share," Ms. Sandberg said.

Discussion Questions

Did Facebook go too far in its use of user data for its mood experiment? What rules or restrictions should guide the use of personal data for research purposes?

Dr. Stephen Needel
9 years ago

We are constantly being manipulated (or people are attempting to manipulate us) on the internet. Amazon recommending items is an overt act of manipulation. Reddit letting its users decide what is and is not important is a self-selected form of manipulation. Putting on my scientist hat, I don’t have a problem with this.

I find it interesting that everyone is ignoring the finding that the impact of the study variable was trivial. If Facebook is supposed to be driving society, as many claim, we would have expected a much bigger impact.

Mel Kleiman
9 years ago

First reaction: YES. Follow-up reaction: NO.

How did I get to the NO? Well, if Facebook had not let us know that it had done the research and what it found, it could simply have used those findings to manipulate viewers. Because those who conducted the study shared what they learned, users can gain control over their own thinking.

Without the research, someone would have—or had already—figured out what was happening, but the consumer would have never known.

Roger Saunders
9 years ago

Stating the obvious, privacy will be one of the top issues companies have to address in the next three years. If they hope to keep the bureaucrats in Washington out of their hair and the trial lawyers out of the way, they would be wise to move quickly to put privacy practices in place.

That being said, none among us runs a business to make the attorneys happy. The focus is on the customer. The consumer is the trusted partner. Facebook’s actions breached that trust. The company should commit to not repeating them.

Paula Rosenblum
9 years ago

The problem with “crossing the line” is that it MOVES the line. That means, next time you can do a little bit more, and then a little bit more after that.

There’s an old saying in retail: "It’s better to ask forgiveness than permission." Actually, in this case, it’s not. It’s one thing to say, "Okay, I recognize what I write here becomes part of my personal record." It’s one thing to create ads that tweak our emotions in one way or another (Happiness = Coke?), but it’s quite another to learn someone has played with your emotions to "see what happens."

The end of that long and slippery slope has the potential to be evil; I have no other word for it. Social engineering is disturbing. Facebook could be killing the goose that lays the golden egg, driving us all elsewhere.

It’s not a practice Ms. Sandberg and friends should lean into.

Mohamed Amer
9 years ago

Absolutely crossed the line. I am surprised that researchers from Cornell and UC San Francisco were also involved. Academics doing research on people must gain approval from an Institutional Review Board (an ethics board) and that did not happen, at least at Cornell.

Just because you can doesn’t mean it is "right" to do or that you should. Technology is expanding the possibility horizon, but it is also creating unanticipated ethical issues. This is a case in which economic gain was allowed to trump ethics.

We’re not lab rats and although some would argue that all of marketing is legalized manipulation, that is very different from being manipulated without explicit or implicit consent.

Ed Rosenbaum
9 years ago

My problem with Facebook is that, as the article stated, this was poorly communicated. Thinking back on the recent past, it seems that Facebook has had several of these "poorly communicated" moments. Could it possibly be that they think they are above the rules of common decency when it comes to this sort of thing?

Warren Thayer
9 years ago

A sneaky move by Facebook at a time when, from past sneakiness, they should be trying very hard for transparency. As for rules or restrictions that should guide them, they might test out the idea of “common sense.” Otherwise, the attorneys will ultimately guide them.

Tony Orlando
9 years ago

Unfortunately, this happens all the time without our knowledge. Marketers are using online information to push all kinds of things they want to sell you, and online hackers are feasting on this as well. I don’t think there is much we can do except watch our wallets and credit cards, as someone, legit or not, wants a piece of you. We cannot put the technology back in the bottle, so be careful.

David Livingston
9 years ago

Facebook has to find ways to maximize income and needs to experiment from time to time. Since I didn’t notice and didn’t care, that makes it perfectly all right. It’s all in the fine print. As long as it’s part of the user agreement, it should be OK.

Tom Redd
9 years ago

Facebook. Too far? Not in their minds. In Facebook users’ minds? I estimate that a majority of them are not even aware of this psych experiment that was run on them. The average Facebooker is too obsessed with taking selfies and sharing more and more of their personal life on a wide-open, unsecured bandstand.

I am very anti-Facebook (oh, really?). No matter the rules or guidelines, Facebook the company has the attitude, "We will do what we want with our users."

I have advised my family members for years to kill their Facebook accounts. The potential for negative ramifications is huge, but my kids just keep spilling their lives into Facebook. The best response is to kill your Facebook account and advise at least seven other people to do the same. Two or three might.

If I were a retailer I would never have a Facebook site. Someday it will be a problem rather than a promotional tool.

Carole Meagher
9 years ago

Yes, because the fundamental ethic in research is the consent of the participant!

Ryan Mathews
9 years ago

Facebook didn’t violate the letter of its user agreement but it did violate its spirit—at least in the minds of a good number of members.

The real problem is that, now that more users are aware that by virtue of opening an account they de facto agree to manipulation, they will begin to have additional doubts about the network’s integrity.

As far as rules go—users have to learn to READ those pesky agreements rather than automatically clicking “Click to Accept.” Yet another case of caveat emptor.

Jason Williams
9 years ago

No. I sign up for social messaging services such as Facebook, Twitter and LinkedIn knowing full well that these companies will use my data. If you don’t want your data used in these “experiments,” don’t use social media.

Rev. Tim George
9 years ago

When you have a Facebook page, you do not pay for it because you are the product. Something is given up by doing so, but I do not believe Facebook crossed the line on their experiment. We as consumers are always being used as targets, so that companies know how to advertise to us. So it should not surprise us that such an experiment would be conducted on us.

We can learn a lot about ourselves from this experiment, and we should accept the good that comes with the bad. One must weigh the consequences of having a Facebook account against the benefits in such cases. We love to keep up with family and friends; the price of doing so is that we become targets for advertising dollars. I do not have a problem with this practice, and believe it to be more positive than negative.

Joan Treistman
9 years ago

Excuse me, but how exactly do we know the impact was trivial? I believe that judgment was made based on what was overtly visible on Facebook. So how did those manipulated users deal with the other parts of their daily lives? Are we to assume the impact was "trivial" there as well? And for those of you who truly believe it was trivial, on Facebook and in the rest of the day: were you part of the experiment? How do you know it was trivial? There are people for whom even a small change of mood can throw them over the edge. I can’t believe so many people are willing to categorize any deliberate effort to change the mood of unsuspecting citizens as trivial.

Li McClelland
9 years ago

Without question they crossed the line, and in the process they broke faith with some users/members who were already on the fence about how serious the personal privacy issues with Facebook might actually be. Adding the "research" clause to the user agreement after the fact is both unethical and evil, and it suggests that despite the "apology" they plan to continue doing more research.

I seriously doubt this was the only “experiment” so far, either—it’s just the one we’ve found out about.

Li McClelland
9 years ago

I’m sorry for adding a second comment here, but it’s quite obvious that some additional "research," snooping, and selection criteria went into initially deciding who the "nearly" 700,000 guinea pigs would be for this experimentation and analysis. It would be nice to know much more about that process as well.

Marge Laney
9 years ago

Yes, absolutely. Even though you sign away a lot of rights to your data privacy when you sign on to Facebook, content experimentation for psychological analysis without consent is wrong, and a case can be, and is being, made that it is illegal.

Ms. Sandberg’s statement that Facebook’s top priority is privacy and security seems incongruous with its actions; it is an almost idiotic response. Trifling with people’s emotions and then, when caught, brushing it off because it supposedly had little or no impact on people’s moods adds insult to injury.

Facebook needs to remember who they are and why people were attracted to them in the first place. Zuckerberg built the platform as a "trustworthy" way for one person to communicate with another. It has grown from that, but "trust" is still a prime ingredient of its success. Lose that trust and members will find a platform they can trust. It’s that simple.

Marie Haines
9 years ago

What I find terrifying about this is the subtle manipulation of information. In war and politics, the person who controls the media controls what information is available to the public. A simple change of phrase, the timed release of data, a careful selection of images: all of these things have influenced, and can influence, public reaction.

Too many people impulsively react without consideration of the source of their information. “Everyone says, everyone knows, my neighbor says” are frequently quoted sources.

So while a simple social experiment by Facebook appears harmless on the surface, just think how powerful this manipulation of data could be in a politically unstable situation.

Gene Detroyer
9 years ago

I am with some of my colleagues on this one. My first reaction was that they did cross the line. But then I realized: this is Facebook. Facebook is hardly a place where privacy is expected.

Then there was my third reaction—the study is fascinating. What a laboratory we have here. What other questions could we explore?

Raymond D. Jones
9 years ago

I think this was way overblown. It was basically a college psychology experiment. It was conducted on a very small percentage of Facebook users for a very short period of time and the impact was measured without harm to anyone.

Now clearly, Facebook has the capacity to manipulate users. However, this is true of all media. Doesn’t advertising attempt to manipulate perceptions? Is social media exempt from this? In fact, social media is less regulated than actual advertising. It is up to the individual to evaluate the content. That seems to be a bigger issue than some psych experiment.

George-Marie Glover
9 years ago

Privacy is defined as “the state or condition of being free from being observed or disturbed by other people.”

The very nature of Facebook and social media in general is not private. That’s why it’s called “social media.”

To expect that these platforms will not be used to manipulate you is as unrealistic as expecting people to never gossip and always keep secrets. It goes against human nature.

Craig Sundstrom
9 years ago

Facebook stumbles; it issues an “apology” that shows it’s just as clueless as it was in the previous stumble; rinse; repeat….

Is anyone surprised here that a company run by a group of self-important twenty-somethings constantly runs into ethical problems? And more importantly, does anyone think “rules or restrictions” are either (1) likely to be enacted, or (2) meaningfully enforced (even) if they are? The best solution is for people who don’t like Facebook—for any of the dozens of reasons that exist—to minimize their use of it…or stop altogether.

Larry Negrich
9 years ago

I find it troubling that the company aimed its apology (at least in the quote in this article) at user privacy and data security. Privacy is certainly an aspect of this story. But the bigger issue is whether a company can conduct social experiments without the consent of the participants. I would say it does not have that right.

Karen S. Herman
9 years ago

My concerns here are, first, the "truth" of my future user experiences on Facebook and, second, a deeper concern about authenticity.

At least Google lets you know that your data is mined to improve your user experience and, in the end, I find that it is for my benefit and makes my time on Google faster and better.

In this case, the actions by Facebook were self-serving. Facebook users were manipulated, but not for their benefit. This experiment was for Facebook’s benefit. This is the fact that does not sit well with me.

With my social site activities, authenticity is important to me, and I’d like to see Facebook publish a notice of such experiments in the future, with a few details such as the time frame of the experiment and why the research is beneficial.

One thing is for sure. Facebook certainly knows how to reach its users.

Tim Moerke
9 years ago

I don’t think Facebook’s experiment and advertising based on mined data are comparable activities. When we see or hear an ad, we’re aware that it is taking place and that someone is trying to elicit a response from us (buying something, supporting a cause, etc.). But that was not so with what Facebook did, where users were not even aware of the manipulation. An IRB would never have approved the data collection methods used in this experiment before it was conducted (which is why one was consulted only about the analysis, after the fact).

Camille P. Schuster, Ph.D.
9 years ago

Using data from users in an aggregate fashion is the kind of research a reasonable person might expect Facebook to conduct. Deliberately manipulating the information people see, without their knowledge or overt participation, may be covered in the user agreement, but it is not something people expect or like, as evidenced by the backlash. Asking people if they are willing to participate, especially if they receive value, is likely to get a positive response. Offending consumers is not a way to show them respect.

Janet Dorenkott
9 years ago

FB crossed the line, and I say that as a stockholder. It’s one thing to use information that was voluntarily given for target marketing. It’s completely different to intentionally try to manipulate emotions. To say the impact was "extremely minimal" is ridiculous.

According to the NIH, an estimated 26.2 percent of Americans ages 18 and older suffer from a mental disorder. How do they know what actions someone might have taken as a result of their emotional manipulation?

Academia has also crossed the line here. Even those people who think it didn’t impact them have no way of knowing how it impacted others. If they get a pass, they will do it again and so will others.
