In January 2012, Facebook, along with researchers from Cornell University and the University of California, San Francisco, conducted an experiment on over half a million News Feeds to determine the emotional effects of posts on users. Facebook also wanted to determine whether users who saw an increased number of positive posts would be more engaged and less likely to quit the site.
The Experiment
The experiment ran for one week in January 2012, during which the Facebook News Feeds of 689,003 users were filtered to show an increased proportion of positive or negative posts. The filters applied only to News Feeds; users could still see all of their friends’ posts by going to their pages.
[note]Facebook has always filtered News Feeds; it can’t show you every post from your friends because there are simply too many posts. Instead, Facebook filters your News Feed to show you selected posts. How Facebook should filter News Feeds is now a hot topic.[/note]
From the study:
Because people’s friends frequently produce much more content than one person can view, the News Feed filters posts, stories, and activities undertaken by friends. News Feed is the primary manner by which people see content that friends share. Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging.
The study can be read in full here: Experimental evidence of massive-scale emotional contagion through social networks.
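For a sense of how this kind of ranking works in general, here is a toy sketch of score-based feed filtering. It is purely illustrative: Facebook’s actual algorithm is proprietary, and the post fields, signals, and weights below are invented for the example.

```python
# Illustrative only: a toy score-and-truncate feed filter. The post
# fields and weights are invented; Facebook's real ranking algorithm
# is proprietary and far more complex.

def score_post(post):
    """Assign a relevance score from simple engagement signals."""
    return (
        2.0 * post["likes"]
        + 3.0 * post["comments"]
        + 1.0 * post["friend_affinity"]  # how often you interact with this friend
        - 0.1 * post["age_hours"]        # older posts rank lower
    )

def build_feed(candidate_posts, limit=20):
    """Rank all candidate posts and keep only the top few."""
    ranked = sorted(candidate_posts, key=score_post, reverse=True)
    return ranked[:limit]

posts = [
    {"id": 1, "likes": 40, "comments": 5, "friend_affinity": 8, "age_hours": 2},
    {"id": 2, "likes": 3, "comments": 0, "friend_affinity": 1, "age_hours": 30},
]
print([p["id"] for p in build_feed(posts)])  # [1, 2]: post 1 ranks first
```

The experiment’s tweak amounted to treating the emotional tone of a post as one more input to a filter like this, suppressing positive posts for one group of users and negative posts for another.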
Results
The experiment showed that emotions can be contagious, spreading among friends through Facebook posts:
- Users who saw an increased proportion of positive posts responded by posting more positive posts and posting more frequently.
- Users who saw an increased proportion of negative posts responded by posting more negative posts and posting less frequently.
According to the study, “Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”
⇒ Also see It’s Time to Spring Clean Your Facebook! for information about a similar report from the Public Library of Science, which showed the effect of positive and negative Facebook posts on emotions.
Controversy
A storm of controversy has erupted over this research, with many upset about how the study was conducted. The users whose feeds were filtered weren’t expressly informed that their News Feeds were being altered, weren’t asked for consent, and weren’t told that their emotions were being studied by Facebook. Facebook has never revealed whose feeds were altered for the study.
Facebook claims that its Terms of Use covered research and constituted informed consent from its users. The authors of the study say that their research “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”
However, according to Forbes, Facebook’s Terms of Use did not cover research in January 2012 when the experiment was conducted. Consent to research was added to the Terms of Use only four months after the experiment: Facebook Added ‘Research’ To User Agreement 4 Months After Emotion Manipulation Study.
Further, Facebook has not revealed what impact the study has had on our News Feeds. Perhaps our News Feeds have been filtered to omit more negative posts ever since the study was conducted. Perhaps Facebook’s News Feed algorithm didn’t change post-study. We just don’t know.
We may not care. USA Today is reporting that while people are upset over Facebook’s altering News Feeds without the informed consent of its users, no one seems to be quitting the site in a huff over the controversy: No one mad enough to quit Facebook over research study.
Poll
How do you feel about Facebook altering News Feeds for a social experiment? Vote in today’s Wonder of Tech poll and let us know your thoughts:
⇒ Learn how you can take charge of your News Feed, Fix Your Flooded Facebook Feed (Without Unfriending Anyone).
Your Thoughts
What do you think about Facebook’s social experiment? Do you think Facebook should have informed affected users of its experiment with their News Feeds? What responsibility should social media sites have to their users? Do you think your News Feed has been filtered to show more positive posts? Share your thoughts in the Comments section below!
Justin Charleston says
I think this is wrong, especially with the US government trying to get interwoven into FB’s business. Not a good combination.
Carolyn Nicander Mohr says
Hi Justin, while I don’t have any information about the US government being involved in this experiment, the news of it shows that the potential exists for all sorts of manipulation.
Ravi Chahar says
Hi Carolyn,
People are crazy about using such emoticons.
I have noticed most of my friends are using them. When they are sad, they show it through these emoticons.
The experiment was good in my opinion, and we all know that for everything there is always some controversy. We don’t need to bat an eye at it.
On social networking sites these emoticons have greater influence than simple written text. I like to see them and use them.
I hope you have a great weekend. :)
~Ravi
Carolyn Nicander Mohr says
Hi Ravi, Yes, when people use emoticons it’s much easier to tell what their emotions are. When people use humor or are sarcastic we may not realize it without a wink icon. 😉
You bring up an excellent point. How did the experiment analyze the emotions? With emoticons the messages would be easy to read but without emoticons the messages might be misinterpreted if they were read by a software program.
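For the record, the study’s authors say they classified posts with automated word counting (the LIWC2007 software): a post counted as positive if it contained at least one positive word and negative if it contained at least one negative word. Here is a toy sketch of that approach, with tiny invented word lists far smaller than LIWC’s dictionaries:

```python
# Toy word-count sentiment classification, loosely in the spirit of
# the LIWC approach the study reports using: a post is "positive" if
# it contains at least one positive word and "negative" if it
# contains at least one negative word (it can be both). These word
# lists are invented for illustration.

POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "angry"}

def labels(post_text):
    """Return the emotion labels triggered by the post's words."""
    words = {w.strip(".,!?;:") for w in post_text.lower().split()}
    found = set()
    if words & POSITIVE:
        found.add("positive")
    if words & NEGATIVE:
        found.add("negative")
    return found or {"neutral"}

print(labels("What a wonderful, happy day!"))   # {'positive'}
print(labels("I hate this terrible weather."))  # {'negative'}
```

Word counting like this is exactly why sarcasm gets misread: the software sees only the words, not the tone.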
iRewardChart says
Frankly, it doesn’t bother me much. I don’t mean that it shouldn’t matter to everyone else. I have most of my posts set to public. Anyone can see them, even Google … and they probably are indexing them too. Facebook is like an open book. If anyone thinks they can maintain privacy on FB, maybe they shouldn’t be on FB in the first place. Once you put it on FB, that’s it, you lose control. So one should be careful before posting anything. Then you don’t have to worry about these frivolous experiments.
Every company does A/B testing; even we do (as an app company). It’s just pure analytics. So it’s cool.
Carolyn Nicander Mohr says
Hi iRewardChart, Always great to see you back here at The Wonder of Tech!
Good point, there are really multiple issues with the experiment. First, Facebook posts that were supposed to be “private” among friends were analyzed by researchers. If someone had a very personal post with their privacy level set to “Only Friends,” they might be upset that researchers were analyzing that post and using it to manipulate their friends’ emotions.
Also, some people are upset that Facebook was filtering posts to manipulate emotions. By showing someone more negative posts from their friends in the experiment, Facebook seems to have evoked more negative emotions. If someone were suffering from depression then having Facebook make them feel more negative as part of an experiment without their knowledge could be considered wrong.
The reactions from Wonder of Tech readers in the poll, in comments and on social media seem to show very mixed feelings among Facebook users.
Lawrence Serewicz says
Carolyn,
Thanks for an interesting article and poll. I have a Facebook account only out of necessity. I rarely use it and keep my information to a minimum. I would quit it but for the fact that people I need to contact through it are on it.
For me what is disturbing is how easily and effortlessly Facebook came to the decision to manipulate its users. The ethics seem to have been designed by lawyers. By that I mean the first question was “Is it legal?” and not “Is it ethical?” What this shows is that the utopia of social media and social networks is simply that: a utopian vision. I do not wish to rehash the arguments made by critics of the web. Instead, I want to suggest that the “experiment” reveals something darker about human nature than we care to consider.
The people conducting the “experiment” did not see the subjects as people like themselves. I find it rather odd that anyone would want to be unknowingly manipulated in their emotional state to satisfy a company’s curiosity about its ability to manipulate people. Would the people who run Facebook and designed the “experiment” want that done to them? I would imagine if employees were being treated in this way, they would have a situation on their hands. The incident makes one wonder whether that is what Facebook is doing. Is it also experimenting on its employees?
The deeper issue is one of trust. Why should we trust anything that Facebook or its employees have to say when they have demonstrated that they are willing to manipulate their users unconditionally and without concern for their consent?
Many might focus on privacy or better terms and conditions, but while those are interesting and perhaps important, they miss a fundamental point: terms and conditions are designed so that the company can find a way to take advantage of its customers legally. Is this what the world has come to? Is this an ethical way to do business? It may be legal, but is it ethical?
Carolyn Nicander Mohr says
Hi Lawrence, Welcome to The Wonder of Tech! You raise some very interesting points. Facebook didn’t seem to have regard for the emotions of its users, other than how to manipulate them, which has raised concern from many people. The legality of the research is also in question, especially since Forbes found that the use of information for research was only added to the Terms and Conditions after the experiment took place. Also see International Business Times: Facebook Experiment Raises Legal Questions: Could Lawsuit Follow Mood Manipulation Research?
As we share more of our lives on social media we do have to be cautious about how that information will be used. Very few of us actually read the Terms and Conditions so is including “research” in the lengthy text actually enough to claim “informed consent”?
The good news is that this experiment has ignited a conversation about the ethics of social media sites and how our information will be used. Facebook and most of the university researchers didn’t seem to consider the ethics of the research and I don’t know whether lawyers were consulted or not. But they certainly have caused a firestorm of controversy with this research.
Thanks so much for taking the time to share your insights with us, Lawrence!
Adrienne says
Hey Carolyn,
It’s just one of those things, yet again, where people have to understand that Facebook is a free service, and because of that they make their own rules and do their own thing.
As it is right now, we don’t see all of the stuff our friends post because of the way the News Feed is set up. I mean, for those who don’t have a lot of friends, I think they should be able to view everything if they want. If it’s a lot, oh well.
As it is right now, I don’t see negative stuff in my feed because I have it filtered out myself. Yep, everything I don’t want to see or hear about, I have filtered, and I don’t even see the ads.
I’ve never been happy with the way Facebook has done some stuff, but as everyone says, it’s their site and they’re going to do whatever they want anyway. If we’re that ticked off about it, then we should leave, which we all know people are not going to do. So, oh well.
Thanks for reporting about this and Happy 4th of July!
~Adrienne
Carolyn Nicander Mohr says
Hi Adrienne, Yes, Facebook is a free service and they provide a lot of benefit to us without us having to pay even a dime. That’s great that you have the negative people filtered from your News Feed anyway.
You’re right, there are plenty of times that Facebook does things people don’t like, yet there’s never been any one thing that has caused people to leave Facebook in droves.
Anurag says
Hi Carolyn,
Well, I don’t see any problem with this experiment. If I had a site, I would not tell my visitors what I was going to do with it. Would you?
I mean, if anyone wants to run an experiment, then for it to have a greater chance of success, he or she would not tell people what’s going on. Would they?
Carolyn Nicander Mohr says
Hi Anurag, Interesting thoughts. You’re right, websites can run a lot of experiments to see which content is most appealing to their users. They could try different formats, fonts, colors, images, and topics and see which options drive more traffic. Once they’ve discovered what works best, they probably wouldn’t reveal their findings, to prevent competitors from learning their secrets to success.
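For anyone curious what that kind of testing looks like under the hood, here is a minimal sketch of deterministic A/B assignment; the experiment name and variants are invented for the example:

```python
# Minimal sketch of deterministic A/B assignment: hashing the user ID
# puts each visitor in a stable bucket, so they always see the same
# variant. The experiment name and variants are invented.

import hashlib

VARIANTS = ["blue_button", "green_button"]

def assign_variant(user_id, experiment="homepage_test"):
    """Hash user + experiment into a stable bucket, then pick a variant."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

for uid in ["alice", "bob", "carol"]:
    print(uid, "->", assign_variant(uid))
```

The site then compares a metric such as click-through rate between the two groups; the controversy here was that Facebook’s “variant” was the emotional tone of what people saw.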
Mike Maynard says
Hi Carolyn,
I finally got here to read this. I find the experiment interesting. I post mostly humorous and positive stuff on Facebook, but there is a lot of negative and even disturbing content, so we need to know its emotional effect. I would like more user control over the News Feed, especially over corporate content. My friends can be blocked, but a lot of my friends’ content gets filtered out now and replaced with business content. We need more choice over what we see.
I’m photographing the horses again today. It’s supposed to be a huge event with people coming from all over the country. They wanted the local newspaper – they’re getting me instead! I might submit them to the local paper. I need more practice ready for next week’s carnivals. It’s cloudy, but the light isn’t too bad. I hope the rain holds off. At least in Britain, I’m constantly challenged! I’m getting used to the techniques required for events now. It’s different to landscapes, especially with changing weather. I hope to get some shots today using a 300mm lens so that might be my subject again on Tuesday!
Carolyn Nicander Mohr says
Hi Mike, I’m glad that you are actively filtering your Facebook feeds. Many people don’t do that so they have Facebook do it for them. Having user control over the feeds is important but not everyone will take advantage of that capability.
I imagine the lighting in England is very challenging because it changes so often. I look forward to seeing your pictures tomorrow, Mike!
Jatin says
To be honest, I don’t like what Facebook did. Before doing it, they should have at least told us that they were going to do so.
But on the other side, I didn’t pay a dime for my FB account, and they own it too. Right?
It’s like my decisions are changing every now and then. Need to see a FB expert. 🙂
Carolyn Nicander Mohr says
Hi Jatin, Welcome to The Wonder of Tech! Yes, the reactions are quite mixed, both in the poll results and in the comments. How much should we expect from a free service? What would it take for people to leave Facebook in droves? Certainly more than this, it seems.
Ann Nunziata says
I think the main problem with it is that Facebook saw nothing wrong with intentionally manipulating the emotions of its users without their knowledge. It’s a little scary.
At the same time, I’m very careful with what I post there, and have been thinking about dropping my account and page there altogether. Now I’m even afraid to like anything since that’s also being “used”.
So far, the most enjoyment I’ve gotten is being kept aware of the activities of a friend in Chicago and sharing bird photos with birdwatchers. Not sure that’s worth the invasion of privacy.
Carolyn Nicander Mohr says
Hi Ann, Good point, if you don’t Like or share anything, then Facebook can’t exploit that information. But then is it even worthwhile to be on Facebook?