In everyday life, many of us experience lethologica without even realizing it. What is it? Simply put, the difficulty of finding the right word, which usually worsens with age. Although this difficulty may indicate changes in the brain consistent with the early (“preclinical”) phases of Alzheimer’s disease, a recent study from the University of Toronto (Canada) suggests that it is fundamentally the speed of speech that serves as an indicator of brain health in older adults.
To reach that conclusion, the researchers asked 125 healthy adults, ages 18 to 90, to describe a scene in detail. Recordings of these descriptions were later analyzed using artificial intelligence (AI) software that extracted features such as speech rate, length of pauses between words, and the variety of words used.
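To give a sense of what such features look like in practice, here is a minimal Python sketch, not the study’s actual software, that computes speech rate, average pause length, and vocabulary variety from a time-aligned transcript. The TimedWord structure and the sample data are illustrative assumptions, not details taken from the study.

```python
from dataclasses import dataclass

@dataclass
class TimedWord:
    text: str     # the transcribed word
    start: float  # word onset, in seconds
    end: float    # word offset, in seconds

def speech_features(words: list[TimedWord]) -> dict[str, float]:
    """Compute simple timing and vocabulary features from a time-aligned transcript."""
    total_time = words[-1].end - words[0].start  # duration of the description, in seconds
    n_words = len(words)
    # Speech rate: words produced per minute of speaking.
    words_per_minute = n_words / (total_time / 60)
    # Pauses: silent gaps between the end of one word and the start of the next.
    pauses = [b.start - a.end for a, b in zip(words, words[1:]) if b.start > a.end]
    mean_pause = sum(pauses) / len(pauses) if pauses else 0.0
    # Lexical variety: distinct words divided by total words (type-token ratio).
    type_token_ratio = len({w.text.lower() for w in words}) / n_words
    return {
        "words_per_minute": words_per_minute,
        "mean_pause_seconds": mean_pause,
        "type_token_ratio": type_token_ratio,
    }

# Hypothetical time-aligned snippet of a scene description.
sample = [
    TimedWord("the", 0.00, 0.15), TimedWord("broom", 0.20, 0.60),
    TimedWord("is", 0.95, 1.05), TimedWord("leaning", 1.10, 1.50),
    TimedWord("against", 1.55, 1.95), TimedWord("the", 2.40, 2.50),
    TimedWord("wall", 2.55, 2.90),
]
print(speech_features(sample))
```

A real pipeline would first need automatic speech recognition with word-level timestamps, but the same three quantities (rate, pauses, variety) are the kind of features the recordings were reduced to.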
Participants also completed a series of standard tests that measured concentration, thinking speed, and the ability to plan and carry out tasks. The age-related decline in these “executive” abilities was closely linked to the pace of a person’s everyday speech, suggesting a broader decline than just difficulty finding the right word.
A novel aspect of this study was the use of a “picture-word interference task,” designed to separate the two steps of naming an object: finding the correct word and sending the command to the mouth to say it out loud. During this task, participants were shown images of everyday objects (such as a broom) while an audio clip of a word related in meaning (such as “mop”) or similar in sound (such as “groom”) was played.
Interestingly, the researchers found that older adults’ natural speech speed was related to their speed in naming pictures. This suggests that the cognitive and linguistic changes that occur with age could be due to a general slowdown in processing, rather than a specific problem in retrieving words from memory.
Although the results of this study are interesting, word retrieval in response to picture cues may not capture the complexity of vocabulary use in unconstrained everyday conversation. Verbal fluency tasks, which require participants to generate as many words as possible from a given category (for example, animals or fruits), or beginning with a specific letter, within a time limit, can also be used to better analyze the “tip of the tongue” phenomenon.
This is the name given to the temporary inability to bring a word to mind, despite the feeling that it is there, about to emerge. These tasks involve the active retrieval and production of words from our own vocabulary, similar to the processes involved in natural speech. They therefore allow clinicians to identify deficits beyond what is expected in normal aging, as well as to detect incipient neurodegenerative diseases.
This matters because, although verbal fluency does not decline significantly with normal aging (as a 2022 study demonstrated), poor performance on these tasks can warn of neurodegenerative diseases such as Alzheimer’s. The verbal fluency test engages several brain regions involved in language, memory, and executive functioning, so it can help clarify which brain regions are affected by cognitive decline.
If the authors of the University of Toronto study had also delved deeper into participants’ subjective experience when struggling to retrieve words, they could have helped create more powerful tools for quantifying and detecting early cognitive decline.
However, this study has opened exciting doors for future research by demonstrating that it is not just what we say, but also how quickly we say it, that can reveal cognitive changes.
Leveraging natural language processing (a type of AI that uses computational techniques to analyze and understand human language data), this work builds on previous studies that observed subtle changes in the spoken and written language of public figures such as Ronald Reagan and Iris Murdoch in the years before their dementia diagnoses.
While those reports were based on a retrospective look after a dementia diagnosis, the new study offers a more systematic, data-driven and future-oriented approach.
Rapid advances in natural language processing will make it possible to automatically detect changes in language, such as a slower speech rate, which could help identify people at risk before more serious symptoms appear.
Claire Lancaster is a Professor of Dementia at the University of Sussex and an Alzheimer’s researcher. Alice Stanton is a doctoral student in the same department and center. This article was originally published on The Conversation.