Scientists in the US have created a tool for identifying signs of depression through speech patterns, and they say it could help diagnose the condition more accurately.

The system works by using a machine learning algorithm to find vowel sounds associated with depression, and it's designed to work alongside doctors as they assess patients, not replace the human element completely.

Previous studies have identified that depression can change the way we talk: our speech becomes flatter and more monotone, and we start leaving longer pauses.

With this in mind, researchers from the University of Southern California (USC) developed their tool – called SimSensei – which applies machine learning to the vowel sounds in a person's speech and performs a frequency analysis on them to spot signs of depression and post-traumatic stress disorder (PTSD).
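
To give a rough sense of what a frequency analysis of a vowel sound involves, here's a minimal Python sketch that picks out the strongest frequency components of a short audio segment. The sample rate, the synthetic 'vowel', and the simple peak-picking are illustrative assumptions – this is not the USC team's actual pipeline, which works on real recorded speech.

```python
# Illustrative only: a toy frequency analysis of a short "vowel" segment.
# Real systems work on recorded speech and use more robust formant estimation.
import numpy as np

def dominant_frequencies(segment, sample_rate, top_n=2):
    """Return the top_n strongest frequency components (in Hz) of a segment."""
    windowed = segment * np.hanning(len(segment))   # taper to reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))        # magnitude spectrum
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / sample_rate)
    strongest = np.argsort(spectrum)[::-1][:top_n]  # indices of the largest peaks
    return np.sort(freqs[strongest])

# A synthetic "vowel": two sine waves at 700 Hz and 1,200 Hz, roughly where the
# first two resonances (formants) of an /a/-like vowel might sit.
sr = 16000
t = np.arange(0, 0.1, 1.0 / sr)
fake_vowel = np.sin(2 * np.pi * 700 * t) + 0.8 * np.sin(2 * np.pi * 1200 * t)
print(dominant_frequencies(fake_vowel, sr))  # roughly [ 700. 1200.]
```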

As Michael Byrne explains at Motherboard, SimSensei uses the k-means algorithm, first described in 1967, which sorts large data sets into clusters around average values (centroids); those clusters can then be compared against 'normal' speech patterns.
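
As a rough sketch of how that clustering step can work, the snippet below uses the off-the-shelf k-means implementation in scikit-learn to group made-up vowel measurements (first and second formant frequencies, F1 and F2) into clusters around their average values. The features, cluster count, and data are assumptions for illustration, not the study's actual setup.

```python
# Illustrative only: clustering made-up vowel measurements with k-means.
# Each row is one vowel token described by two formant frequencies (F1, F2) in Hz.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical tokens scattered around three vowel-like targets.
vowel_tokens = np.vstack([
    rng.normal(loc=[300, 2300], scale=40, size=(50, 2)),   # /i/-like
    rng.normal(loc=[700, 1200], scale=40, size=(50, 2)),   # /a/-like
    rng.normal(loc=[300, 800],  scale=40, size=(50, 2)),   # /u/-like
])

# k-means groups the tokens into k clusters around their mean values (centroids).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(vowel_tokens)
print("Cluster centroids (F1, F2) in Hz:")
print(np.round(kmeans.cluster_centers_))
```

Comparing where a speaker's clusters sit relative to reference clusters from typical speech is one way output like this could be set against 'normal' speech patterns.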

In a new study, the researchers ran their algorithm on speech from 253 volunteers, who were also asked to fill out a self-assessment questionnaire.

"The experiments show a significantly reduced vowel space in subjects that scored positively on the questionnaires," the authors report. "These findings could potentially support treatment of affective disorders, like depression and PTSD in the future."

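To picture what a 'reduced vowel space' means: vowel space is often summarised as the area of the triangle traced out by the corner vowels /i/, /a/ and /u/ in F1/F2 frequency space, and the smaller that area, the more centralised and less distinct the vowels. The sketch below computes that area for invented formant values – it illustrates the concept rather than the exact measure used in the paper.

```python
# Illustrative only: "vowel space" measured as the area of the F1/F2 triangle
# spanned by the corner vowels; a smaller area means more centralised, reduced vowels.
import numpy as np

def polygon_area(points):
    """Shoelace formula for the area of a polygon given ordered (x, y) vertices."""
    x, y = points[:, 0], points[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

# Hypothetical corner-vowel formants (F1, F2) in Hz, ordered around the triangle.
typical = np.array([[300, 2300], [700, 1200], [300, 800]])   # /i/, /a/, /u/
reduced = np.array([[400, 2000], [600, 1300], [400, 1000]])  # more centralised

print("Typical vowel space area:", polygon_area(typical))   # 300000.0
print("Reduced vowel space area:", polygon_area(reduced))   # 100000.0
```
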
Don't underestimate the difficulty of diagnosing depression, either. Not only does it come in many different forms with many different outward signs, but it's also very difficult to quantify and monitor.

A study carried out back in 2009 revealed that only about half of patients with depression were correctly diagnosed by their doctors. That's not a great figure, but it might be understandable, considering how hard depression can be to spot – and the workload non-specialist general practitioners have to get through.

Because of this, having a digital assistant on hand could be hugely helpful for doctors who would otherwise be relying on their own observations and on patients' own accounts of how they feel – accounts which may not always be totally reliable.

It's just one of many ways that SimSensei may prove useful. The system has also been employed as part of a job interview training program designed to get veterans prepared for life away from the army, by helping to analyse their speech, mannerisms, and conversational skills in virtual interview sessions.

Looking ahead, the SimSensei team says it wants to use the algorithm to see if disorders such as schizophrenia and Parkinson's can be diagnosed as well – so it's possible AI-assisted diagnoses could become an important part of human health services in the future.

The study is published in IEEE Transactions on Affective Computing.