Red ‘Facts,’ Blue ‘Facts’: The Psychology of Truthiness

New research finds our moral convictions strongly influence which facts we choose to believe about a given issue.

Sen. Daniel Patrick Moynihan's famous assertion that "You are entitled to your own opinion, but not your own facts" seems increasingly quaint at a time when one person's self-evident truth is dismissed by another as fabrication or myth.

New research provides a sobering reason why we can’t even agree on what we’re arguing about: We align our perception of reality to comfortably coexist with our moral convictions.

“In the realm of moral reasoning, at least, a clean separation of opinion and fact may be difficult to achieve,” write psychologists Brittany Liu and Peter Ditto of the University of California, Irvine. “Our data consistently show that evaluations of an act’s inherent morality are strongly associated with factual beliefs about both its positive and negative consequences.”

The researchers illustrate this point with three experiments, which they describe in the journal Social Psychological and Personality Science.

One study featured 1,567 people who visited psychologist Jonathan Haidt's website yourmorals.org. Participants were presented with four moral issues in random order—two generally seen as acceptable by conservatives (capital punishment and forceful interrogation of suspected terrorists), and two usually seen as acceptable by liberals (embryonic stem cell research and promoting condom use as part of sex education).

For each issue, participants rated, on a one-to-seven scale, the extent to which they considered the practice morally wrong, and whether they considered it wrong even if it were proven to achieve its aims effectively. Then they answered a series of questions regarding the likely costs and benefits of each course of action.

The results: “The more participants believed that the action was immoral even if it had beneficial consequences, the less they believed it would actually produce those consequences, and the more they believed it would have undesirable costs.”

For example, the researchers write, “The more participants endorsed the belief that condom education was morally wrong even if it prevented pregnancy and STDs, the less they believed that condoms were effective at preventing these problems, and the more they believed that promoting condom use encouraged teenagers to have sex.”

Now, some people do argue that an action such as waterboarding is wrong and should be outlawed, even if national security suffers. But the researchers say that such highly principled stances are rare, largely because they conflict with an inherent tendency to favor practices that offer significant benefits at low (personal) costs.

“Our research suggests that people resolve such dilemmas by bringing cost-benefit beliefs into line with moral evaluations,” they conclude, “such that the right course of action morally becomes the right course of action practically as well.”

Liu and Ditto concede this does not bode well for political compromise, at least on morally charged issues. They call it “particularly disheartening” that people with “strong moral convictions and high opinions of how informed they are”—which is to say, members of the political and media elite—are more likely to cite “facts” to support their moral positions. In this way, misinformation gets spread widely, and the actual truth becomes less and less clear.

These findings help us understand how certain anti-abortion absolutists can believe that women can’t get pregnant via rape. If you consider both abortion and forcing a woman to carry a rapist’s child to be morally unacceptable, the easiest way out of that dilemma may be to will yourself to believe the latter is impossible.

If Liu and Ditto are right, this is merely an extreme example of a depressingly common phenomenon. We may not be entitled to our own facts, but that doesn’t stop us from claiming them.
