NEWS 30 June 2014

Emotion proves contagious on Facebook


US — Facebook’s data science team, in conjunction with US scientists, has conducted an experiment on nearly 700,000 of the social media network’s users to demonstrate “emotional contagion”.


The resulting paper, published in the Proceedings of the National Academy of Sciences of the United States of America, showed that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness and without direct interaction between users.

Facebook altered the tone of 689,003 people’s Facebook feeds, without their knowledge, by reducing the number of posts containing either positive or negative emotions and monitoring the tone of subsequent posts produced by those users. When people saw fewer positive expressions of emotion on their news feed, they in turn produced fewer positive posts and more negative posts. The opposite pattern emerged when users saw fewer expressions of negative emotion.

“These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks,” the authors wrote.

The paper points out that the study was conducted using software that detects emotional language, meaning that no text was seen by researchers; as such, the authors argue, it was “consistent with Facebook’s data use policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research”. However, according to a story published on the Guardian news site, a number of researchers have condemned the experiment, claiming that it breaches the ethical guidelines for informed consent. The story quotes James Grimmelmann, professor of law at the University of Maryland, who said: “The study harmed participants because it changed their mood.

“This is bad, even for Facebook.”


5 Comments

10 years ago

This is an outrageous experiment and permission should have been sought. What is the opinion of the MRS?


10 years ago

As researchers ourselves, maybe we ought to think about it this way. You know that images, words, and sounds make people happy and sad. When YOU conduct research on packages, aisle design, brand names, etc., you are knowingly affecting people's emotions and making them happy and unhappy by the things you present to them. Do YOU state in your consent forms that the project may make them more or less happy? Also worth considering: Facebook did not create positive or negative comments. They simply showed or didn't show the positive or negative comments your friends intended you to see. Your friends wrote those negative things. Your friends wanted you to sympathize with them. Would you like Facebook to show you none of your friends' negativity so that you don't feel unhappy? I'll confess right now: I'm not sure whether I think what Facebook did is or isn't unethical. I'm still sorting it out.


10 years ago

An interesting, but highly controversial experiment, with major ethical issues. See my IJMR Editor's blog on this: https://www.mrs.org.uk/ijmr_blog


10 years ago

The public is becoming increasingly frustrated by organisations that hide behind small print when transacting with them. Such companies show a lack of respect for the people who are the lifeblood of their business, and in the long run this will do them no favours as it damages their brand and reputation. Personal data can no longer be treated as a “free good”. Instead of pushing the boundaries of ethical standards, companies like Facebook should be at the forefront of working to raise them.


10 years ago

No doubt in my mind: This research required informed consent from participants. FB didn't obtain it. And no, the ToS was/is not sufficient. This study was not only unethical, but potentially harmful. Sure, "probably" no one among the 689,003 users committed suicide as a result of this experiment, but however far-fetched that statement may appear, the answer is that we don't know.
