Artificial Intelligence may detect early Alzheimer's in speech patterns
Advanced AI analysis may be able to spot subtle features in a person's speech that signal early cognitive changes
ISLAMABAD (Online) - Without AI, analyzing a person’s speech for clues about cognitive decline was time-consuming and often unsuccessful, according to one study author.
Advanced AI analysis may be able to spot subtle features in a person’s speech that could help doctors detect Alzheimer’s disease years before symptoms of mental decline appear.
That’s according to recent research published in the Alzheimer’s Association journal Diagnosis, Assessment and Disease Monitoring. The study found that natural language processing (NLP), speech recognition, and machine learning could identify difficult-to-measure voice changes that correspond with specific biomarkers linked to Alzheimer’s disease, including levels of the protein amyloid-beta, which scientists have likened to “seeds” that may later grow into Alzheimer’s.
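The article does not say which software the researchers used. Purely as a rough illustration of the speech-recognition step such a pipeline needs, the Python sketch below transcribes a short recording with the open-source Whisper model; the model choice and the file name are assumptions, not details from the study.

```python
# Illustrative only: transcribe a short speech sample so that text-based (NLP)
# measures can be computed from it. Whisper and the file name are stand-ins,
# not tools named in the study.
import whisper  # pip install openai-whisper

model = whisper.load_model("base")                    # small general-purpose model
result = model.transcribe("circus_description.wav")   # hypothetical 1- to 2-minute recording
transcript = result["text"]
print(transcript)
```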
“Prior to the development of machine learning and NLP, the detailed study of speech patterns in patients was extremely labor intensive and often not successful, because the changes in the early stages [of Alzheimer’s] are frequently undetectable to the human ear,” said lead study author Ihab Hajjar, MD, a professor of neurology at UT Southwestern’s Peter O’Donnell Jr. Brain Institute, in a news release.
“This novel method of testing performed well in detecting those with mild cognitive impairment and more specifically in identifying patients with evidence of Alzheimer’s disease — even when it cannot be easily detected using standard cognitive assessments.”
Dr. Hajjar and his collaborators collected data on 206 people aged 50 and older: 114 who met the criteria for mild cognitive impairment and 92 who were cognitively unimpaired. Each person’s cognitive status was determined through standard testing.
Study subjects were also recorded as they gave a one- to two-minute description of a colorful circus procession.
Subtle speech features may signal Alzheimer’s risk
Using sophisticated computer analysis of these recordings, scientists could identify and evaluate specific speech features (a rough code sketch of a few such measures follows the list), including:
• how fast a person talks
• pitch
• voicing of vowel and consonant sounds
• grammatical complexity
• speech motor control
• idea density
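The study’s exact feature set and tooling are not described in this article, so the sketch below is only a plausible way to compute a few of the listed measures (speaking rate, pitch, voicing, and loudness) from a recording and its transcript, using the librosa audio library; grammatical complexity and idea density would typically require a syntactic parser and are omitted.

```python
# A rough sketch of computing a few of the speech features listed above.
# librosa, the pyin pitch tracker, and the word-count measures are illustrative
# stand-ins, not the study's actual tooling.
import librosa
import numpy as np

def basic_speech_features(audio_path: str, transcript: str) -> dict:
    y, sr = librosa.load(audio_path, sr=16000)          # load audio at 16 kHz
    duration = librosa.get_duration(y=y, sr=sr)

    # Speaking rate: words per second, taken from the transcript.
    speaking_rate = len(transcript.split()) / duration if duration > 0 else 0.0

    # Pitch: fundamental-frequency estimate over voiced frames (NaN when unvoiced).
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    mean_pitch = float(np.nanmean(f0)) if np.any(voiced_flag) else 0.0

    # Voicing: fraction of frames the tracker judges to be voiced.
    voiced_fraction = float(np.mean(voiced_flag))

    # Loudness: average RMS energy of the recording.
    loudness = float(np.mean(librosa.feature.rms(y=y)))

    # Grammatical complexity and idea density need a syntactic parser
    # (e.g., spaCy) and are omitted from this sketch.
    return {
        "speaking_rate_wps": speaking_rate,
        "mean_pitch_hz": mean_pitch,
        "voiced_fraction": voiced_fraction,
        "mean_rms_loudness": loudness,
    }
```

A call like basic_speech_features("circus_description.wav", transcript) would then yield one feature vector per participant.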
The research team also examined cerebrospinal fluid samples for the amyloid-beta protein. One form, amyloid-beta peptide 42, is especially toxic, according to the National Institute on Aging, and plays a significant role in Alzheimer’s disease. A total of 40 cognitively unimpaired and 63 impaired individuals were found to be amyloid-positive.
In addition, scientists reviewed MRI brain scans, which can provide measures of hippocampal volume. In the early stages of Alzheimer’s disease, the hippocampus, which plays a significant role in memory and learning, shows rapid tissue loss, according to research.
By comparing all these tests, the study authors found that specific speech characteristics, including loudness, corresponded with amyloid presence and hippocampal size.
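The article does not describe the statistical models behind that comparison. As a hedged sketch of what such an analysis could look like, the code below uses scikit-learn with placeholder data to ask two questions: whether speech features predict amyloid status (logistic regression, cross-validated AUC) and whether they explain variance in hippocampal volume (linear regression). The feature matrix, labels, and volumes are random stand-ins, not study data.

```python
# Hypothetical analysis sketch, not the authors' method: relate speech features
# to amyloid status and hippocampal volume with simple scikit-learn models.
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_participants, n_features = 206, 4                          # sample size matches the study; features are illustrative
X = rng.normal(size=(n_participants, n_features))            # speech features per participant (placeholder)
amyloid_positive = rng.integers(0, 2, size=n_participants)   # 1 = amyloid-positive (placeholder labels)
hippocampal_volume = rng.normal(size=n_participants)         # normalized volumes (placeholder)

# Do speech features predict amyloid status better than chance?
clf = LogisticRegression(max_iter=1000)
auc = cross_val_score(clf, X, amyloid_positive, cv=5, scoring="roc_auc")
print("cross-validated AUC:", auc.mean())   # ~0.5 here, since the data are random

# Do the same features explain variance in hippocampal volume?
reg = LinearRegression().fit(X, hippocampal_volume)
print("R^2 on hippocampal volume:", reg.score(X, hippocampal_volume))
```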
Melissa Lee, PhD, a program officer for biomarker development at the Alzheimer’s Drug Discovery Foundation, notes that the researchers were able to identify specific speech patterns in both the impaired and unimpaired individuals who had biological indicators of Alzheimer’s.
“It’s interesting and really exciting that the researchers could see the speech patterns in the cognitively unimpaired group with amyloid beta present, because individuals who have amyloid beta protein present are the ones that are more likely to go on and develop this disease,” says Dr. Lee.