To spot a liar, ignore everything except the level of detail in a person's story, new research suggests.

If a person provides rich descriptions of who, what, when, how, and why, it's likely they are telling the truth. If they skim over these details, they are probably lying.

Using this very simple test and nothing else, people can separate truth from lies with up to nearly 80 percent accuracy, researchers from the University of Amsterdam have found.

When it comes to catching liars, we usually try to factor as many tell-tale signs as possible into our assessment. Do they look shifty? Are they anxious? Why are they fidgeting?

After the 9/11 terror attacks, for instance, US airport security personnel were trained to look for 92 behavioral clues that a person was being deceptive. Polygraphs, commonly called lie detectors, combine different physiological inputs such as blood pressure, heart rate, and breathing rate to detect possible lies.

However, research shows that even trained professionals do little better than random chance when attempting to separate truth from falsehoods.

Part of the problem is that integrating lots of conflicting data points on the fly and converting that to a binary decision about whether someone is telling a lie is extremely difficult.

"It's an impossible task," says Bruno Verschuere, a forensic psychologist and lead author on the study.

"People can't assess all those signals in a short time, let alone integrate multiple signals into an accurate and truthful judgment."

Another problem is that people have stereotypes about what innocent and guilty people look like, which are not very predictive of truth-telling or lying.

To overcome these issues, Verschuere and his colleagues at the University of Amsterdam decided to try a "radical alternative". They instructed study participants to focus on only one clue – the level of detail in a person's story – and ignore everything else.

"We reasoned the truth may be found in simplicity and we propose to drop rather than add cues when trying to detect deception," the researchers say.

In a series of nine studies, 1,445 people were instructed to guess whether handwritten statements, video transcripts, video interviews, or live interviews about a student's activities on campus were true or false.

These accounts came from students who had either enacted a mock theft of an exam from a locker and lied about it, or had innocently wandered around campus and told the truth about their activities.

The study participants who relied on intuition to detect lies, or who used many factors to make the decision, performed no better than random chance.

But those who were instructed to focus solely on the level of detail in the accounts could separate truth from lies with 59–79 percent accuracy.

These participants were told to examine the "degree to which the message includes details such as descriptions of people, places, actions, objects, events, and the timing of events," and "the degree to which the message seemed complete, concrete, striking, or rich in details".

"Our data show that relying on one good cue can be more beneficial than using many cues," the researchers say.

The researchers' 'use-the-best (and ignore-the-rest)' rule of thumb was the superior lie-detection method regardless of whether participants knew the purpose of the activity was to detect lies.

This suggests that pre-existing stereotypes about guilt and innocence were not getting in the way of using the level of detail as a lie detection tool.
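The 'use-the-best' rule of thumb can be sketched as a one-cue threshold classifier. Everything here is a hypothetical illustration, not the researchers' actual procedure: the function name, the 1–7 detail rating scale, and the cutoff of 4 are all assumptions made for the sketch.

```python
def judge_single_cue(detail_rating, threshold=4):
    """Classify a statement using only one cue: the level of
    detail, rated here on a hypothetical 1-7 scale.

    All other signals (gaze, fidgeting, nervousness...) are
    deliberately ignored, per the 'ignore-the-rest' principle.
    """
    return "truth" if detail_rating >= threshold else "lie"

# Two hypothetical statements, rated for richness of detail
statements = [
    {"detail": 6},  # rich in who/what/when/where specifics
    {"detail": 2},  # vague, skims over specifics
]
verdicts = [judge_single_cue(s["detail"]) for s in statements]
print(verdicts)  # ['truth', 'lie']
```

The point of the sketch is the deliberate simplicity: a single informative cue with a fixed cutoff, rather than an attempt to weigh many mostly uninformative signals on the fly.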

In high-stakes situations, liars are likely to enrich their stories with detail to boost their credibility, so lie-detection rules of thumb may well be context-dependent, the researchers say.

However, using more and more cues – or even big data and machine learning – isn't necessarily going to improve accuracy in lie detection, they argue.

In a previous study that used 11 different criteria to detect lies, people rated the level of detail correctly, but the other, uninformative criteria clouded their overall judgment.

"One counterintuitive way of dealing with an information overload is to simply ignore most of the available information… Sometimes, less is more," the researchers say.

This paper was published in Nature Human Behaviour.