Human Lie Detectors: The Death of the Dead Giveaway

Amateurs and experts alike overestimate their ability to divine truth and deception. But when criminal investigators do it, it can be very bad news for the accused.

We’ve all been lied to, and most of us have a high opinion of our ability to tell a lie from the truth. Yet research repeatedly shows that confidence to be misplaced and that judges, customs inspectors, and yes, detectives, make lousy lie detectors.

Those in law enforcement are trained to “read” body language, affect, facial expressions, mannerisms, and ways of speaking, and to believe that they can trust their gut. They learn that if a suspect averts their gaze, touches their nose, chews a fingernail, strokes the back of their head, slouches, or fidgets, they are likely lying and thus guilty.

Virtually all scientific research finds this mindset is counterproductive and even lowers the accuracy of judgments. People under stress—being wrongly accused certainly qualifies—can behave in ways impossible to distinguish from those who are lying. Yet the accused may be convicted in the court of public opinion—or worse—in large part because they don’t react to tragedy or the loss of a loved one as others want them to or expect.

Righting What’s Wrong in Criminal Justice

Wrongful convictions stem from the belated entrance of scientific rigor into the field of forensics, systemic problems, and the ubiquitous ‘human factor.’ In the coming weeks, a series of stories by crime author Sue Russell looks at why convictions go wrong, at the common reluctance to rectify error, and at innovations to better safeguard justice.

Stories so far:

A Porn Stash and a False Confession: How to Ruin Someone’s Life in the American Justice System

Red Flags: Early Warnings of Wrongful Convictions

Human Lie Detectors: The Death of the Dead Giveaway

Why Fingerprints Aren’t Proof

Litigating Lineups: Why the American Justice System Is Keeping a Close Eye on Witness Identification

The Right and Privilege of Post-Conviction DNA Testing

Seeking Second Chances Without DNA

Why Can’t Law Enforcement Admit They Blow It Sometimes?

A Prescription for Criminal Justice Errors

Marty Tankleff was 17 in 1988 when he awoke one horrific morning to find his parents bludgeoned and stabbed in their Suffolk County, New York, home. His mother Arlene was dead, his father Seymour clinging to life. Marty called 9-1-1 and rendered first aid.

Marty and family members promptly told detectives that Seymour Tankleff’s business associate, bagel store owner Jerry Steuerman, had borrowed half a million dollars from Tankleff and threatened him when Tankleff demanded he begin paying it back. Steuerman had gambling debts, and police found a promissory note showing that $50,000 of his very high-interest loan from Tankleff was due that week. Steuerman had left the Tankleff home at 3 a.m. that morning after a weekly poker gathering at which the two men were barely speaking. The attack took place hours later.

The New York Times reported that a week later, Steuerman flew to California leaving behind notes to fake his own death. He changed his toupee and traveled under the alias Jay Winston. Police quickly tracked him down. Despite all this, detective James McCready did not consider Steuerman a suspect.

McCready had immediately focused on Marty as responsible, citing greed as the boy’s motive. (Marty was not due to inherit until he was 25, something McCready later conceded he did not know.) It was Marty’s muted, expressionless demeanor that had his attention.

“He was sitting as calm as could be, with his hands clasped,” McCready told CBS. “I think he would have been shaken, been very upset.” The detective was sure Marty was lying: “I get a feeling, it’s not so much what is said. It’s the way in which it is said.”

Marty’s interrogation was not recorded. Only McCready’s partial handwritten account remained. McCready also tricked Marty—an acceptable and legal tactic in the U.S. He told him his hair was found in his mother’s hand. He also told the boy that his father (who died weeks later) regained consciousness long enough to say Marty was the attacker. Marty was stunned. His father never lied to him. So, he began considering awful explanations. “Could I have blacked out?” he asked detectives. “Could I be possessed?” Within hours he confessed.

Physical evidence didn’t incriminate Marty. But his confession, which he quickly recanted and refused to sign, proved impossible for jurors to ignore. At age 19, he was convicted and sentenced to 50 years to life in prison.

In 2003, a family-hired private investigator, Jay Salpeter, found witnesses who said Steuerman had hired two hit men to kill the Tankleffs; Marty’s appellate attorney laid out evidence indicating that others had committed the murders. Prosecutors dismissed the new witnesses and their stories as shaky, all parties denied involvement, and Steuerman has never been formally named a suspect in the case. Nonetheless, in December 2007, a state appellate court overturned Tankleff’s conviction.

A new trial was ordered, but State Attorney General Andrew Cuomo’s office stated that the new evidence would likely lead a jury to find him not guilty and declined to retry him. Free at last at 36, Marty Tankleff is now studying for a law degree and wants to become a defense lawyer.

Much was made of his demeanor and apparent lack of grief. But too much grief may also suggest guilt. Detectives in the case of Jeffrey Deskovic—freed in 2006 after 16 years in prison after DNA tests showed he hadn’t raped and murdered his high school classmate in Peekskill, New York—believed he was lying because he was too distraught. He also falsely confessed.

So-called inappropriate emotional responses from someone who has learned of a loved one’s death don’t carry great weight with former Florida homicide detective David Taylor, a veteran law enforcement trainer based in West Virginia.

“Everyone responds to traumatic situations completely different,” he says. “Giving death notifications, some people will ball up in a corner and cry their guts out. Some will sit there in complete disbelief, or become argumentative. How would you be, accused of a crime? And how the person accuses you is going to impact your reaction.

“We’re humans. Humans make mistakes. I think when you interview hundreds, then thousands, of people you learn to pick up some traits that are indicative of deception. But you also have to realize that could be something unique to a particular person that may not be truly suggestive of deception, and how do you ferret that out?”

Being able to accurately detect lies and truth is a goal with major implications for, say, homeland security and interrogating terrorism suspects. There are many body language experts. Paul Ekman’s work with the Facial Action Coding System, which categorizes more than 10,000 subtle facial expressions, or micro-expressions, gained wide attention when the TV series Lie to Me modeled its lead character on Ekman. His system, taught to many in law enforcement, claims an accuracy rate of more than 95 percent. But there simply is no ironclad way to be sure that someone is lying. (One researcher criticizes the TV show, saying it increases viewers’ skepticism but not their lie-detecting accuracy.)

Yet Steven Drizin, clinical professor at Northwestern University School of Law and cofounder of its Center on Wrongful Convictions of Youth, says that many companies offering interrogation training continue to put great weight on behavioral analysis, over-promising officers that they will be able to detect deception with greater accuracy.

“There’s really no way to tell whether or not someone is reacting because they are under extreme stress or because they’re lying,” says Drizin. “And even if you think that they’re lying to you about one thing, it doesn’t tell you whether they’re lying to you about the issue that you’re most interested in: the murder or the other crimes.”

Bottom line, generalizing is a recipe for trouble.

One so-called classic sign of deception—whether someone looks you in the eye—is a good example of a potentially dangerous belief.

“People from certain cultures would never look you in the eye,” notes Drizin. “It’s impolite. And other people would be extremely uncomfortable if you invaded their personal space, which is a classic tactic in police interrogation. Even touching somebody—whether it’s a gentle touch or bumping knees together. People from different cultures are going to react differently to those kinds of tactics. And what may look like lying may be nothing more than anxiety.”

Other behaviors read as deceptive can be normal adolescent behavior. “Slumping in one’s chair, not giving eye contact,” he says. “Or shaking one’s leg, or looking to get over a conversation quickly. So these kinds of behavioral cues don’t work in large part for large segments of the juvenile population.”

Kids are more impulsive, gullible, and trusting, and don’t focus as much on long-term consequences. And trainers in interrogation techniques don’t distinguish, says Drizin, “between their system for detecting deception in adults versus juveniles. So it’s fraught with error.”

Social psychologists Bella DePaulo and Charles F. Bond Jr., who have studied deceiving and detecting deceit for decades, are also skeptical of the notion that skilled, experienced investigators possess superior deception-detection abilities.

DePaulo, a visiting professor of psychology at the University of California, Santa Barbara, and Bond, of Texas Christian University, are coauthors of the book, Is Anyone Really Any Good at Detecting Lies?, a collection of their professional papers. Their review of more than 100 related studies found little to support the notion that good detectives have anything akin to a sixth sense.

“We found that on average, people are only slightly better than chance,” says DePaulo. “In many studies, a chance level of accuracy is 50 percent—people trying to differentiate lies from truths are shown videotapes in which half of the people are lying and half are telling the truth. Overall, they get about 54 percent correct. That’s better than chance, but not by much.”

Experienced detectors of lies fared no better, says DePaulo, “though sometimes they were more confident.”
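DePaulo’s 54 percent figure can be put in rough statistical perspective. The sketch below (the numbers of judgments are illustrative, not drawn from any particular study) computes how many standard errors an observed accuracy sits above the 50 percent chance level: for a single modest study, 54 percent is within noise, but pooled across tens of thousands of judgments it is reliably, if barely, above chance.

```python
import math

def z_score(accuracy, n, chance=0.5):
    """How many standard errors an observed accuracy rate sits above
    chance, treating n lie/truth judgments as independent coin flips."""
    se = math.sqrt(chance * (1 - chance) / n)  # standard error of a proportion
    return (accuracy - chance) / se

# One small study of 100 judgments: 54% is less than one SE above chance.
print(round(z_score(0.54, 100), 2))    # → 0.8

# Pooled at meta-analytic scale (an illustrative 25,000 judgments):
# the same 54% is many SEs above chance, yet still a tiny edge.
print(round(z_score(0.54, 25000), 2))  # → 12.65
```

The point of the exercise is that statistical reliability and practical usefulness come apart: a 4-point edge can be real in aggregate while remaining nearly worthless for judging any single suspect.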

In one study, participants ranging from federal law enforcement officers (both new recruits and “advanced” officers) to undergraduates with no special experience or training heard 64 different audio clips of people lying and telling the truth.

When it came to telling the liars from the honest, says DePaulo, “no one got any better from the first half of the test—the first 32 messages—to the second half. But the advanced officers—and only the advanced officers—became increasingly confident, even though they were not getting any better at separating the liars from the truth-tellers.”

DePaulo believes that experienced detectives develop theories about cues and behaviors that they think important, then start seeing evidence of those theories. “What they are not noticing,” she says, “are the times when the evidence does not support their theories.”

Cognitive neuroscientist Itiel Dror of the University College London Institute of Cognitive Neuroscience describes “the expert effect,” in which, after acquiring a certain level of expertise, “you become less, not more, effective than the average person, likely because of overconfidence or overblown belief in yourself.”

Social psychologist Carol Tavris believes this is what is at work in criminal profilers who can send an investigation off course because they often diagnose criminals long-distance: “They all shoot from the hip, and they all are tremendously arrogant about their ability to diagnose somebody. They’re more like psycho-cowboys, or something. They shoot first and ask questions later.”

Tavris considers the ‘humans as lie detectors’ concept perhaps the most dangerous in professional training.

“If you were going to have one new element in the training of police and detectives and judges,” she says, “it would be humility control…an understanding of why we are not great judges of who’s lying and who’s telling the truth.” Professionalism, she argues, must be tempered with the question: What happens if we’re wrong? “And that’s not built into the current system.”

“In the adversarial system it’s hard to be humble, because if you are humble, you are opening up the possibility for attack,” notes professor David Faigman of the University of California, Hastings College of the Law in San Francisco. “And the system really encourages selection of experts who are polarized, because you want people who are strong for your side. And if you get somebody who is reasonable, then they look weak and…they’re not going to be very effective.”

One day, brain scans from functional magnetic resonance imaging may be used to detect a suspect’s lies. Some researchers caution that that day has yet to arrive, though No Lie MRI in California has claimed 90 percent accuracy; human efforts don’t come close. Even so, 90 percent in a criminal trial is far from proof positive. And fMRI, like the polygraph, may be vulnerable to test-beating strategies or to different responses in people with psychopathologies.

DePaulo isn’t ready to “rule out gut feelings totally.” Preliminary research into what she calls “indirect deception detection” suggests that accuracy sometimes improves if people are asked if they feel suspicious or if they got enough information, rather than being asked directly if they think someone lied.

But for the foreseeable future, a little skepticism about human lie detectors is in order.
