Something doesn’t feel quite right when you learn that Facebook actively manipulates its news feed to see how it affects us emotionally. Yet that’s exactly what the company did in a recent experiment on over 600,000 users that surfaced just this weekend. Here’s how the study went down, as summarized by New Scientist:

A face-to-face encounter with someone who is sad or cheerful can leave us feeling the same way. This emotional contagion has been shown to last anywhere from a few seconds to weeks.

A team of researchers, led by Adam Kramer at Facebook in Menlo Park, California, was curious to see if this phenomenon would occur online. To find out, they manipulated which posts showed up on the news feeds of more than 600,000 Facebook users. For one week, some users saw fewer posts with negative emotional words than usual, while others saw fewer posts with positive ones.
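The mechanics behind that manipulation are worth picturing. The published paper scored posts using the LIWC word lists and probabilistically withheld emotional posts from view. The sketch below is purely illustrative and is not the researchers' code: the word lists, the omission rate, and the function names are all invented stand-ins.

```python
import random

# Hypothetical stand-ins for LIWC's positive/negative emotion word lists.
POSITIVE_WORDS = {"happy", "love", "great", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "lonely"}

def has_emotion(post_text, word_set):
    """Return True if the post contains any word from the given emotion list."""
    return any(w.strip(".,!?") in word_set for w in post_text.lower().split())

def filter_feed(posts, suppress="negative", omit_probability=0.5):
    """Return a feed with some emotional posts withheld.

    The study omitted emotional posts probabilistically; the 0.5 rate here is
    a made-up placeholder, not the experiment's actual parameter.
    """
    target = NEGATIVE_WORDS if suppress == "negative" else POSITIVE_WORDS
    kept = []
    for post in posts:
        if has_emotion(post, target) and random.random() < omit_probability:
            continue  # withhold this emotional post from the rendered feed
        kept.append(post)
    return kept
```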

It’s well known that Facebook manipulates your news feed based on who and what you interact with on the site. However, the fact that the company manipulated the feed specifically to test your emotional response feels dirty.

The report raises a question: what if negative content increased overall usage of the site? Negative sensationalism is commonplace in mainstream media because it’s known to drive eyeballs. Yet media companies typically don’t have the means to measure the emotional impact of their content. Yes, negative content is likely to produce negative emotions, and vice versa, but that’s about as much as media companies can measure (aside from outright polling people).

Facebook, on the other hand, has the means to figure out how users are feeling from the content they publish. Given this insight, Facebook could theoretically prioritize negative content if doing so made you use the site more often. Unfortunately, I don’t have data on whether that actually happens, but it’s clear that Facebook is a medium through which both negative and positive emotions can spread.
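
To make that concrete, here is a rough sketch of what sentiment as a ranking signal could look like: score each candidate post, then nudge its rank by how strongly that sentiment correlates with time on site. Nothing here reflects Facebook's actual ranking system; the `Post` fields, the `engagement_bias` knob, and the scoring formula are all assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    base_relevance: float   # the feed's existing relevance score, whatever it is
    sentiment: float        # -1.0 (negative) .. +1.0 (positive), from some classifier

def rank_feed(posts, engagement_bias=0.0):
    """Order posts by relevance plus a sentiment-based boost.

    `engagement_bias` is a made-up knob: positive values favor upbeat posts,
    negative values favor downbeat ones. If negative content kept people on
    the site longer, a ranker like this could quietly drift negative.
    """
    def score(post):
        return post.base_relevance + engagement_bias * post.sentiment
    return sorted(posts, key=score, reverse=True)
```

Flipping the sign of a single weight like `engagement_bias` is all it would take to tilt a feed from cheerful to bleak, which is why the questions below aren't merely hypothetical.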

Does this mean Facebook should leverage sentiment as a news feed signal, for better or worse? Did Facebook violate users’ trust by actively manipulating their emotions?