Facebook Has Always Manipulated Your Emotions

And positive psychologists would say that this is a good thing.

Emotional contagion is the idea that emotions spread through social networks. If you are around happy people, you are more likely to be happy. If you are around gloomy people, you are more likely to be glum.

The data scientists at Facebook set out to learn whether text-based interactions, absent voice and face-to-face contact, have similar effects. They asked: does emotion remain contagious within digitally mediated settings? They answered this question experimentally by manipulating the emotional tenor of users’ News Feeds and recording the results.

Public reaction was swift: many expressed dismay that Facebook would 1) collect their data without asking and 2) manipulate their emotions.

I’m going to leave aside the ethics of Facebook’s data collection. It raises an important but blurry issue of informed consent in light of Terms of Use agreements, and deserves a post all its own. Instead, I focus on the emotional manipulation, arguing that Facebook was already manipulating your emotions, and likely in ways far more consequential than algorithmically altering the emotional tenor of your News Feed.

First, here is an excerpt from their findings:

In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred.

In brief, Facebook made either negative or positive content relatively more prevalent in users’ News Feeds by filtering out some of the other, and measured how this affected users’ emotionally expressive behavior, as indicated by their own posts. In line with emotional contagion theory, and in contrast to “technology disconnects us and makes us sad through comparison” hypotheses, they found that those exposed to happier content indeed expressed higher rates of positive emotion, while those exposed to sadder content expressed higher rates of negative emotion.
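To make the design concrete, here is a minimal sketch of the manipulation and the outcome measure. It is not Facebook’s actual pipeline: the study classified posts with the LIWC2007 word lists, which the toy lists below only gesture at, and the omission probability is an invented parameter.

```python
import random

# Toy stand-ins for the word lists; the study used LIWC2007,
# whose real lists contain thousands of entries.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def post_valence(post):
    """Label a post by valence if it contains at least one matching
    word, mirroring the study's one-word threshold."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filtered_feed(posts, reduce_valence, omit_prob=0.5):
    """Experimental condition: probabilistically omit posts of one
    valence from the News Feed (omit_prob is an invented parameter)."""
    return [p for p in posts
            if post_valence(p) != reduce_valence
            or random.random() >= omit_prob]

def emotion_word_rate(posts, word_set):
    """Outcome measure: emotion words as a percentage of all the
    words a user produces in their own posts."""
    all_words = [w for p in posts for w in p.lower().split()]
    if not all_words:
        return 0.0
    return 100 * sum(w in word_set for w in all_words) / len(all_words)
```

In this sketch, a treated user’s feed would be built with filtered_feed(feed, "positive"), and the dependent variable would be emotion_word_rate computed over that user’s subsequent posts.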

Looking at the data, there are three points of particular interest:

  • When positive posts were reduced in the News Feed, people used 0.01 percent fewer positive words in their own posts, while increasing the number of negative words they used by 0.04 percent.
  • When negative posts were reduced in the News Feed, people used 0.07 percent fewer negative words in their own posts, while increasing the number of positive words by 0.06 percent.
  • Prior to any manipulation, 22.4 percent of posts contained negative words, compared to 46.8 percent that contained positive words.

Let’s first look at the first two points: the effects of positive and negative content in users’ News Feeds. These effects, though statistically significant and in the predicted direction, are really, really tiny. None of them even approaches one percent; in fact, all are below 0.1 percent. That’s so little! The authors acknowledge the small effects, but defend them by translating them into raw numbers, reflecting “hundreds of thousands” of emotion-laden status updates per day. They don’t, however, acknowledge how their (and I quote) “massive” sample size of 689,003 increases the likelihood of finding statistically significant results: with enough participants, even substantively trivial differences clear the conventional bar for significance.
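To see concretely how a sample that large manufactures significance, here is a back-of-the-envelope sketch. The 0.06-point effect is taken from the figures above; the per-user standard deviation is an assumed placeholder, and a plain two-group z-test stands in for the paper’s actual models. The logic holds for any reasonable standard deviation: the standard error shrinks with the square root of the sample size, so a large enough sample renders almost any nonzero effect statistically significant.

```python
import math

# Illustrative numbers: the 0.06-point shift is from the study;
# the per-user standard deviation (assumed_sd) is a made-up placeholder.
effect = 0.06        # percentage-point shift in positive-word use
assumed_sd = 5.0     # assumed SD of a user's positive-word percentage
n_per_group = 689_003 // 2

# The standard error of a two-group mean difference shrinks with sqrt(n),
# so z = effect / SE grows as the sample grows.
se = assumed_sd * math.sqrt(2 / n_per_group)
print(f"z = {effect / se:.1f}")  # ~5.0, far past the 1.96 cutoff for p < .05

# The same tiny effect stops being "significant" at ordinary sample sizes.
for n in (100, 1_000, 10_000, 100_000):
    z = effect / (assumed_sd * math.sqrt(2 / n))
    verdict = "significant" if z > 1.96 else "not significant"
    print(f"n = {n:>7,} per group -> z = {z:.2f} ({verdict})")
```

Run at ordinary sample sizes, the very same 0.06-point difference would not come close to significance; only the enormous N pushes it over the threshold.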

So what’s up with the tiny effects?

The answer, I argue, is that the structural affordances of Facebook are such that users are far more likely to post positive content anyway. For instance, there is no Dislike button, and emoticons are the primary means of visually expressing emotion. Concretely, when someone posts something sad, there is no canned way to respond and no adequate visual representation of sympathy. Nobody wants to Like the death of someone’s grandmother, and a frownie-face emoticon seems decidedly out of place.

The emotional tenor of your News Feed is small potatoes compared to the effects of structural affordances. The affordances of Facebook buffer against variations in content. This is clear in point 3 above, in which positive posts far outnumber negative posts prior to any manipulation. The very small effects of the experimental manipulations indicate that the overall emotional make-up of posts changed little, even when positive content was artificially decreased.

So Facebook was already manipulating your emotions, all of our emotions, and our logical lines of action. We come to know ourselves by seeing what we do, and the selves we perform through social media become important mirrors from which we glean personal reflections. The affordances of Facebook therefore not only shape emotive expression, but also reflect back to users that they are the kind of people who express positive emotions.

Positive psychologists would say this is good; it’s a way in which Facebook helps its users achieve personal happiness. Critical theorists would disagree, arguing that Facebook’s emotional guidance is a capitalist tool that stifles rightful anger, indignation, and mobilization toward social justice. In any case, Facebook is not, and never was, emotionally neutral.

This post originally appeared on Sociological Images, a Pacific Standard partner site, as “Newsflash: Facebook Has Always Manipulated Your Emotions.”