Your Unconscious Mind Is Better Than You Are at Detecting Lies

New research suggests that if we could push aside the biases of our conscious minds, we’d be better at spotting dishonesty.
(Photo: jesadaphorn/Shutterstock)

Can you tell if someone is lying to you? Newly published research suggests you actually have that ability—at least to an extent—but accessing it is a different story.

In two experiments, researchers from the University of California-Berkeley found that people are better at detecting deception through indirect methods that tap into their unconscious minds. They conclude that our conscious minds, hobbled by common misconceptions about how liars behave, tend to trip us up.

“These results provide strong evidence for the idea that although humans cannot consciously discriminate liars from truth-tellers, they do have a sense, on some less-conscious level, of when someone is lying,” Leanne ten Brinke, Dayna Stimson, and Dana Carney write in the journal Psychological Science.

The researchers begin by noting that previous studies have consistently found “human judgments of veracity to be no more accurate than the flip of a coin.” Yet they point out that other research has found primates such as monkeys and chimpanzees are capable of spotting dishonest behavior. It makes no evolutionary sense that this valuable skill would skip our species.

And in fact, the researchers argue, it did not. The problem is that we confuse ourselves with clichéd notions of which non-verbal cues point to deception. “For example,” they write, “the commonly held belief that liars avert their gaze and fidget is false.”

To test their hypothesis, the researchers conducted a pair of experiments in which participants’ conscious and unconscious minds effectively competed to see which could better ferret out deception. Participants watched 90-second “interrogation videos” featuring 12 people accused of stealing $100 from the testing room. Half had actually taken the money; the others were unfairly accused.

The 72 participants (all college undergraduates) watched as the suspects answered both neutral questions (“What is the weather like outside?”) and direct ones (“Did you steal the money?”).

The students then expressed their opinion on whether each was telling the truth. They proved quite inept, successfully picking out liars less than 44 percent of the time.

Finally, they completed a version of the Implicit Association Test, which is designed to measure the automatic, unconscious associations we make between people, objects, and ideas. In this case, “we were interested in whether observing someone tell a lie would, outside of awareness, activate mental concepts associated with deception,” the researchers write.

They found that, in a word game, participants were quicker to accurately categorize terms such as “dishonest” and “deceitful” when the photo and name of one of the actual thieves was visible on their screen.
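The logic of that reaction-time comparison can be illustrated with a minimal sketch. This is not the researchers’ actual scoring procedure (the published IAT uses a more elaborate algorithm); it simply shows the core idea: if people categorize deception-related words faster when a liar’s photo is on screen (a “congruent” pairing) than when a truth-teller’s is (an “incongruent” pairing), that speed gap is read as evidence of an unconscious association. The trial format and field names here are hypothetical.

```python
from statistics import mean

def congruency_advantage(trials):
    """Score IAT-style trials for one participant.

    Each (hypothetical) trial is a tuple:
        (suspect_was_liar, word_is_deceptive, reaction_time_ms)

    A pairing is 'congruent' when the suspect's honesty matches the
    word category (liar + deception word, or truth-teller + truth word).
    Returns mean incongruent RT minus mean congruent RT: a positive
    value means congruent pairings were categorized faster, consistent
    with an unconscious association between the face and deception.
    """
    congruent = [rt for liar, deceptive, rt in trials if liar == deceptive]
    incongruent = [rt for liar, deceptive, rt in trials if liar != deceptive]
    return mean(incongruent) - mean(congruent)

# Toy data: congruent trials are faster, so the advantage is positive.
trials = [
    (True, True, 500),    # liar on screen, word "deceitful"  -> congruent
    (False, False, 520),  # truth-teller, word "honest"       -> congruent
    (True, False, 650),   # liar, word "honest"               -> incongruent
    (False, True, 640),   # truth-teller, word "deceitful"    -> incongruent
]
print(congruency_advantage(trials))
```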

“It appears that viewing a liar automatically activates concepts associated with deception, and viewing a truth-teller automatically activates concepts associated with truth,” the researchers conclude.

“Female participants achieved significantly greater indirect accuracy than male participants,” they add in an interesting aside. “This gender difference is consistent with previous findings that women’s person-perception accuracy is greater than men’s.”

A second test, which used a different technique, came to the same conclusion. It found that “subliminally presented faces of liars and truth-tellers activated and facilitated congruent concepts,” meaning that, once again, “automatic associations were significantly more accurate than controlled, deliberate decisions.”

It all suggests that “accurate lie detection is, indeed, a capacity of the human mind,” the researchers conclude. The problem is our “accurate unconscious assessments” get overridden by our biases and misconceptions. Sad, but true.