Facebook’s Emotional Manipulation

Facebook has recently angered many with the disclosure of a “secret” emotional study on thousands of its users. In January 2012, for the span of one week, Facebook subjected roughly 700,000 randomly selected users – without their knowledge – to emotional manipulation. What the company was actually doing was studying emotional contagion: by tweaking the News Feed algorithm to show some users fewer positive posts and others fewer negative ones, it wanted to see whether it could nudge those users’ own status updates toward more negative or more positive feelings. The problem is, until recently, Facebook did not disclose that the study had even been undertaken.
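
To make the design of such an experiment concrete, here is a minimal sketch in Python. This is not Facebook’s actual code or algorithm – the word lists, the crude sentiment scorer, and the `filter_feed` function are all hypothetical – it only illustrates the reported mechanism: suppress a fraction of posts of one emotional tone, then observe what users post afterward.

```python
import random

# Hypothetical illustration only: Facebook's real News Feed ranking is
# proprietary. This sketch shows the basic shape of the experiment as
# reported -- randomly omitting a share of emotionally positive (or
# negative) posts from a user's feed.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def sentiment(post: str) -> int:
    """Crude word-overlap sentiment: +1 positive, -1 negative, 0 neutral."""
    words = set(post.lower().split())
    score = len(words & POSITIVE_WORDS) - len(words & NEGATIVE_WORDS)
    return (score > 0) - (score < 0)

def filter_feed(posts, suppress=1, omit_rate=0.5, seed=None):
    """Drop a random fraction (omit_rate) of posts whose sentiment
    matches `suppress` (+1 = positive, -1 = negative)."""
    rng = random.Random(seed)
    return [p for p in posts
            if sentiment(p) != suppress or rng.random() > omit_rate]

feed = [
    "I love this wonderful sunny day",
    "Traffic was awful and I hate Mondays",
    "Meeting moved to 3pm",
]
# Suppress all positive posts; only negative and neutral ones remain.
print(filter_feed(feed, suppress=1, omit_rate=1.0, seed=42))
```

In the published study the researchers then compared the emotional tone of what the affected users subsequently posted against a control group – the filtering step above is the part that was done without users’ knowledge.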

The disclosure did not win the company any fans. The study was legal – buried in the fine print of Facebook’s Data Use Policy is language allowing research of this kind to occur – but its ethics are another question entirely.


It does, however, raise questions about why the social media juggernaut decided to conduct the experiment, and why it waited more than two years to reveal it. Legally speaking, there is no real difference between what the Facebook staffers did with this experiment and what they do on a daily basis in tweaking the look and content of individual News Feeds.

However, there is an argument that Facebook bent the rules of ethical research too far, and in doing so may have raised human rights concerns at the very least. Sure, the company did learn that it can influence its users’ emotions, but because it did not tell those users until just last week that this randomized experiment had been conducted, several hundred thousand Facebook users are ticked off that they were used this way, and rightfully so.

The big problem is that in conveying its findings, Facebook has not only shown that yes, it can manipulate its users into feeling one way or the other; the experiment has also had the unexpected and unwanted side effect of making Facebook users as a whole more distrustful. We currently live in a world where information is shared rather freely online. Why, then, should we not trust? The second we give anyone our email address, they can look us up and find out a great deal about us, should they have the wherewithal to do so. But because users’ data was examined without their informed consent, Facebook now faces a full user base of individuals who may not be too happy about sharing their information.

The news that statuses can be manipulated to reflect one feeling or another may be welcome to businesses looking to generate positive responses to their products online. But the news that Facebook conducted this experiment will also leave potential consumers wary about what sort of information, precisely, is out there about them, and that wariness may in turn drive consumers away from businesses. That makes for some truly scary times.
