By Sahana Ghosh
Kolkata, May 10 (IANS) Are you sinking into depression? Let your speech tell you.
Analysing variations in the emotions carried by your speech could point to neurological impairments such as Alzheimer's disease (AD), and an Android app being developed by Indian researchers aims to monitor speech patterns continuously and detect such changes.
"There is a stark contrast between a normal person's speech and one who may have suicidal tendencies, Alzheimer's symptoms or Parkinson's disease. When a person speaks into the app based on speech emotion recognition, it stores the normal voice patterns and detects if there is any deviation over time, Susmita Bhaduri of Kolkata's Deepa Ghosh Research Foundation told IANS.
"This is a non-invasive method for early detection of AD," Bhaduri added.
Almost 47 million people are living with dementia around the world with 4.1 million of them in India, according to the World Alzheimer Report 2015.
In the early stages of AD, the patient suffers from intermittent memory deterioration, leading to a loss of cognitive and perceptual ability in speech, language and the construction of sentences, the researcher said.
"In the early stages there are mild memory losses, patients and their relatives are not able to relate the symptoms with AD and they tend to relate the cognitive changes to age. It usually takes two to three years after onset of symptoms to start medication. Our model could be a routine check-up mechanism," Bhaduri explained.
Apart from Bhaduri, the study, published in the Journal of Neurology and Neuroscience in March, is co-authored by Dipak Ghosh of the C.V. Raman Centre for Physics and Music at Jadavpur University, and by Rajdeep Das, also of the university.
The published data indicates the model's applicability in the diagnosis and prognosis of other cognitive diseases and various types of depression, and even in assessing the suicidal tendencies of severely depressed patients.
"The android application is developed as proprietary software at our foundation. We are in the process of validating it with relevant data, in collaboration with entities like (Kolkata's) Institute of Psychiatry," Bhaduri added.
The technique sits at the interface of human-machine interaction and is grounded in speech emotion recognition, the study of the emotional content of speech signals.
"Our work involves monitoring specific parameters of speech signals spoken out of two elementary emotions, anger and sadness. We analysed 1,200 samples of speech with the tool, showing clear differences between different emotions," Bhaduri explained.
Research is "desperately needed" given the gravity of the situation surrounding Alzheimer's, Amit Dias, an epidemiologist and geriatrician told IANS, conceding diagnostics have a very important role to play.
"We are open to possibilities but we need to try it out and gather evidence for any new technique on the table," Dias, of the Department of Preventive and Social Medicine, Goa Medical College, told IANS.
(Sahana Ghosh can be contacted at sahana.g@ians.in)