As hate speech continues to rise—and not just in the United States—social media platforms like Facebook and Twitter are struggling to deal with the problem. New research suggests it’s vital that they figure it out.
A study from Poland finds exposure to angry, hostile content targeting specific groups indirectly increases people’s prejudices.
“When being frequently exposed to hateful online commentaries, people become increasingly desensitized to them,” writes a research team led by University of Warsaw psychologist Wiktor Soral. “Ultimately, the contents of these commentaries come to shape their perceptions of [perceived outsiders such as] minorities, immigrant groups, and political adversaries.”
“After passing a certain tipping point,” messages preaching hate no longer provoke a strong emotional response, the researchers explain. As a result, this type of content is seen as “less negative and harmful, less important, and less violating of social norms.” This leads to decreased sympathy for the targets, and, eventually, increased prejudice against them.
In the journal Aggressive Behavior, Soral and his colleagues demonstrate this damaging dynamic via two large surveys and a laboratory study. The first survey featured 1,007 Polish adults, who indicated how often they encounter hate speech aimed at Muslims or gays and lesbians.
They then read six angry, derogatory assertions—three aimed at each group—and indicated how offensive they found them, using a one-to-seven scale. The statements were not subtle: They included “Muslims are stinky cowards who can only murder women, children, and innocent people” and “I am disgusted by fags.”
Finally, to measure their level of prejudice, participants completed a classic “social distance scale,” in which they expressed the extent to which they would accept members of those minority groups “as a co-worker, as a neighbor, or part of their family.”
“People who frequently encounter examples of hate speech are less inclined to perceive hate speech as an offensive and abusive phenomenon,” the researchers report. “This desensitization to the harmfulness of hate speech was in turn a risk factor of greater prejudice [toward minority groups].”
The second survey, which featured 682 Poles between the ages of 16 and 18, was similarly structured, except the minority group they evaluated was refugees. Participants exposed to more hateful comments about refugees were not only more prejudiced against them: They also expressed “greater support for radical, anti-immigrant government policies.”
This sort of survey raises a chicken-and-egg problem, in that people who are already prejudiced may be exposed to more hateful messages, thanks to their chosen circle of social media contacts. To get a better idea of causation, the researchers conducted the lab study, which featured 75 Polish university undergraduates.
They were told their task was “to read through five web pages from discussion forums and to assess the aesthetics and readability of the page design.” For half the participants, one of the five comments on each page featured hate speech aimed at a specific minority population.
Afterward, all participants read 15 statements that disparaged a variety of groups and rated how offensive they found them, using a one-to-seven scale. Finally, they filled out the aforementioned social distance scale.
Those who had read the offensive comments “perceived examples of hate speech as being significantly less offensive” than those who did not, the researchers report. What’s more, they also “exhibited significantly higher levels of prejudice” than their peers.
The results bring to mind a word that has become quite familiar this year: “normalize.” The more you are exposed to hate speech, the more “normal” it becomes, and the resultant lack of a strong emotional reaction blinds us to “the harm done by the verbal violence,” the researchers conclude.
So as hate speech proliferates on social media, it has negative, coarsening effects on society. Facebook was supposed to link us together; it may, in fact, be driving us further apart.