NEWS | 30 June 2014
US — Facebook’s data science team, in conjunction with US scientists, has conducted an experiment on nearly 700,000 of the social media network’s users to demonstrate “emotional contagion”.
The resulting paper, published in the Proceedings of the National Academy of Sciences of the United States of America, showed that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness, and without direct interaction between users.
Facebook altered the tone of 689,003 people’s Facebook feeds, without their knowledge, by reducing the number of posts containing either positive or negative emotions and monitoring the tone of those users’ subsequent posts. When people saw fewer positive expressions of emotion in their news feed, they in turn produced fewer positive posts and more negative posts. The opposite pattern emerged when users saw fewer expressions of negative emotion.
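The manipulation described above reduces to a few simple steps: classify each friend’s post as emotionally positive or negative, randomly omit a proportion of posts of the targeted valence when building the feed, and then measure the tone of the affected users’ own subsequent posts. The Python sketch below is a minimal illustration of that logic only; the word lists, the omit_probability parameter and the function names are invented for this example, and the article says only that the study used software that detects emotional language, not how that software works.

import random

# Tiny illustrative word lists; real emotion-detection software uses far
# larger dictionaries. These words are assumptions for this sketch only.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry"}

def classify(post: str) -> str:
    # Label a post by whether it contains at least one emotion word;
    # posts containing both kinds are arbitrarily labelled positive here.
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress, omit_probability=0.5):
    # Drop each post of the targeted valence with some probability,
    # mimicking the reduced-positive or reduced-negative feed conditions.
    return [p for p in posts
            if classify(p) != suppress or random.random() > omit_probability]

def positivity_rate(posts):
    # Share of a user's own posts classified as positive: the kind of
    # outcome the study monitored after altering the feed.
    return sum(classify(p) == "positive" for p in posts) / len(posts) if posts else 0.0

# Usage: build a feed with positive friend posts suppressed, then measure
# the tone of what the user goes on to write.
feed = ["feeling great today", "this traffic is awful", "lunch at noon"]
print(filter_feed(feed, suppress="positive"))
print(positivity_rate(["so happy right now", "hate this weather"]))  # 0.5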
“These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks,” the authors wrote.
The paper points out that the study was conducted using software that detects emotional language, so no post text was seen by the researchers; the authors argue this was “consistent with Facebook’s data use policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research”. However, according to a story published on the Guardian news site, a number of researchers have condemned the experiment, claiming that it breaches ethical guidelines on informed consent. The story quotes James Grimmelmann, professor of law at the University of Maryland, who said: “The study harmed participants because it changed their mood.
“This is bad, even for Facebook.”
5 Comments
Thomas Punt
9 years ago
This is an outrageous experiment and permission should have been sought. What is the opinion of the MRS?
Annie Pettit
9 years ago
As researchers ourselves, maybe we ought to think about it this way. You know that images, words, and sounds make people happy and sad. When YOU conduct research on packages, aisle design, brand names, etc, you are knowingly affecting people's emotions and making them happy and unhappy by the things you present to them. Do YOU state in your consent forms that the project may make them more or less happy? Also to consider, Facebook did not create positive or negative comments. They simply showed or didn't show which positive or negative comments your friends intended you to see. Your friends wrote those negative things. Your friends wanted you to sympathize with them. Would you like Facebook to show you none of your friends' negativity so that you don't feel unhappy? I'll confess right now. I'm not sure whether I think what Facebook did is or isn't unethical. I'm still sorting it out.
Peter Mouncey
9 years ago
An interesting, but highly controversial experiment, with major ethical issues. See my IJMR Editor's blog on this: https://www.mrs.org.uk/ijmr_blog
Jane Frost, chief executive, MRS
9 years ago
The public is becoming increasingly frustrated by organisations that hide behind small print when transacting with them. Such companies show a lack of respect for the people who are the lifeblood of their business and in the long run this will ultimately do them no favours as it damages their brand and reputation. Personal data can no longer be treated as a “free good”. Instead of pushing the boundaries of ethical standards, companies like Facebook should be at the forefront of working to raise them.
Dan Kvistbo
9 years ago
No doubt in my mind: This research required informed consent from participants. FB didn't obtain it. And no, the ToS was/is not sufficient. This study was not only unethical, but potentially harmful. Sure, "probably" no one among the 689,003 users committed suicide as a result of this experiment, but however far-fetched that statement may appear, the answer is that we don't know.