Facebook Tests Emotions And You Will Believe What Happened Next

Before you get out the pitchforks, sit back and think about what the Facebook revelation says about us. We are likely to be especially sensitive to anything we perceive as an invasion of our privacy. Let’s take a trip down the rabbit hole of what the average person considers online privacy.

If you stripped out the NSA looking at everything, what are you left with? A lot of marketing. Those ads you see? Yeah, they follow you around based on your searches. Email providers scan your messages to serve up advertisements. Your entire life online is one giant A/B split test designed to get you to take a certain action.
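If you’ve never seen one up close, here is a minimal sketch of how an A/B split test typically assigns users, assuming a deterministic hash on a user ID (the function and names here are hypothetical, not any real platform’s API):

```python
import hashlib

def ab_bucket(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically assign a user to variant 'A' or 'B'.

    Hashing user_id together with the experiment name keeps the
    assignment stable across sessions without storing any state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    fraction = int(digest[:8], 16) / 0xFFFFFFFF  # map to [0, 1]
    return "A" if fraction < split else "B"

# The same user always lands in the same bucket for a given experiment.
print(ab_bucket("user_42", "bigger_buy_button"))
```

The point of the deterministic hash is that nobody asks your permission: you are silently and permanently sorted into a variant the moment you show up.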

So, what did Facebook do that has the masses ready to burn the house down? It wanted to see whether deliberate changes to roughly 600,000 users’ news feeds would affect their emotional states.

Not sure a study was necessary to see this. The company wanted to know if a user’s emotional state would change based on the positivity or negativity they saw. The report was recently published in the Proceedings of the National Academy of Sciences. See, you’re already bored reading that name.

The PNAS report is titled “Experimental evidence of massive-scale emotional contagion through social networks.” Well, it had to be on Facebook, because I don’t think the journal name or title can fit on Twitter.

Facebook’s experiment took place over one week in early 2012. The company altered the number of positive or negative terms a user saw during that period. Out of every 1,000 words written, subjects, on average, wrote one fewer emotional word.
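To put that number in context, the metric is simply emotional words per 1,000 words written. Here’s a minimal sketch of how such a rate might be computed, using a toy word list rather than whatever classifier Facebook actually used:

```python
# Toy word lists; the real study used a proper sentiment lexicon.
POSITIVE = {"happy", "great", "love", "awesome"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def emotional_rate(text: str) -> float:
    """Return emotional words per 1,000 words of text."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for word in words if word in POSITIVE | NEGATIVE)
    return hits / len(words) * 1000

print(emotional_rate("had a great day and I love this weather"))  # 222.2
```

A shift of one word per thousand is invisible at the individual level, which is presumably why it took hundreds of thousands of news feeds to detect it at all.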

Facebook’s response to the overwhelmingly negative reaction has so far been one of befuddlement. Data scientist Adam Kramer tried to explain the experiment. “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” he wrote. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”

The question is whether people should have been informed, and that goes to ethics. Ethics becomes a subjective call when no physical or mental harm can come of the study. Even the study’s editor is creeped out by the experiment.

Just remember, Facebook does this anyway with its advertising platform. All advertisers manipulate you toward a course of action they want you to take.

Should Facebook be manipulating you in its news feed? No, but a test of how people react to certain content seems within bounds here. It’s just right on the border.