Deep down, everyone just wants to be understood. And regardless of what language we speak, or whether we are blind or sighted, new research hints at a shared, universal non-verbal communication system that comes to life when we gesture without talking.

The study of children aged 3 to 12 was led by Şeyda Özçalışkan, a psychological scientist at Georgia State University in Atlanta and a native Turkish speaker who studies language development across different types of learners and speakers of different languages.

By studying gestures in adults and children, Özçalışkan's research seeks to understand how language shapes the way people think and how they construct and express ideas, patterns that surface in their gestures both when they speak out loud and when they don't.

Gestures are just one kind of non-verbal communication: wordless signals that also include body language, posture, eye contact, and facial expressions, which may indicate how someone is feeling.

Gestures, though, could reveal insights into how children formulate and express ideas – their cognitive abilities – as they develop, Özçalışkan thinks.

In this latest study, 100 children were asked first to describe an action with words and hand movements, and then to describe the same action without speaking using only their hands (what's known as silent gesture).

Half the children were native English speakers, and the other 50 kids spoke Turkish as their first language. English and Turkish make for a good comparison because they differ in terms of how speakers of each language describe events.

"If you're speaking Turkish, if you want to describe someone running into a house, you have to chunk it up. You say, 'he's running and then he enters the house,'" explains Özçalışkan.

"But if it's in English, they'll just say 'he ran into the house,' all in one compact sentence."

"We wanted to find out whether gesture does or does not follow these [linguistic] differences and how early do children learn these patterns," she adds.

When children spoke and gestured at the same time, their gestures followed the conventions of their native language: Turkish-speaking children ordered their gestures the way they would a spoken sentence, while English-speaking children smooshed theirs into one compact movement.

It makes sense that the sequence of their gestures would mirror the order of their words; the kids were acting out scenes while narrating them.

These language-specific patterns showed up in children as young as 3 and 4 years old, which suggests language can influence nonverbal representations of events at an early age.

However, when describing the same scenes without speaking, the sequences of the children's hand gestures were remarkably similar. The language-specific differences in gestures had seemingly evaporated.

Özçalışkan and colleagues found something similar in earlier work with adults: when they refrained from speaking, blind English and Turkish speakers organized their gestures the same way sighted speakers did.

Past studies of German- and English-speaking children have also found that silent gestures don't necessarily follow the structure of a person's native language, though those studies didn't directly compare speakers of different languages the way this new one did.

Özçalışkan and colleagues suggest their findings, although tentative, hint at the possibility that we all share some rudimentary non-verbal communication system that gets overridden or altered once we start learning language.

Of course, all of this rests on researchers interpreting abstract gestures from a few hundred children and adults – hardly enough data to support such a big claim, but certainly an intriguing idea to explore.

The study has been published in Language and Cognition.