

Human-Computer Interfaces Could Provide New Insight into Alzheimer’s


One of the major problems with Alzheimer’s disease is that it is rarely diagnosed at an early stage, when it can be managed most effectively. Now, a team of researchers at Kaunas University of Technology (KTU) is exploring how human-computer interfaces could be adapted for people with memory impairments so they can recognize a visible object in front of them. 

Identifying Visual Stimuli

According to Rytis Maskeliūnas, a researcher at the Department of Multimedia Engineering at KTU, the classification of information visible on the face is a daily human function. 

“While communicating, the face ‘tells’ us the context of the conversation, especially from an emotional point of view, but can we identify visual stimuli based on brain signals?” Maskeliūnas says. 

The study aimed to analyze an individual’s ability to process contextual information from the face and detect how a person responds to it. 

Maskeliūnas says various studies have demonstrated that brain diseases can be analyzed by examining facial muscle and eye movements. This is because degenerative brain disorders affect not only memory and cognitive functions, but also the cranial nerves associated with eye movements. 

The research provides better insight into whether a patient with Alzheimer’s processes visible faces in the brain in the same way as individuals without the disease. 

Dovilė Komolovaitė, a graduate of the KTU Faculty of Mathematics and Natural Sciences, co-authored the study. 

“The study uses data from an electroencephalograph (EEG), which measures the electrical impulses in the brain,” says Komolovaitė.

The experiment carried out for the study was performed on healthy individuals and those with Alzheimer’s. 

“The brain signals of a person with Alzheimer's are typically significantly noisier than in a healthy person,” says Komolovaitė. 

This noise makes it harder for the individual to focus when experiencing symptoms of the disease. 
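
As a rough illustration of what cleaning such noisy recordings can involve, below is a minimal sketch of band-pass filtering an EEG file with the open-source MNE-Python library. The file name, frequency band, and referencing choice are assumptions for illustration, not details taken from the KTU study.

```python
# Illustrative only: file name, filter band, and reference scheme are
# assumptions, not parameters from the KTU study.
import mne

# Load a raw EEG recording (hypothetical file path).
raw = mne.io.read_raw_fif("subject_01_raw.fif", preload=True)

# Band-pass filter to suppress slow drifts and high-frequency muscle
# artifacts, a common first step when recordings are noisy.
raw.filter(l_freq=1.0, h_freq=40.0)

# Re-reference the EEG channels to the average of all electrodes
# (one widely used convention).
raw.set_eeg_reference("average")
```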

The Experiment

The study involved a group of women over the age of 60.

“Older age is one of the main risk factors for dementia, and since the effects of gender were noticed in brain waves, the study is more accurate when only one gender group is chosen,” Komolovaitė continues.

Each person was shown photos of human faces during an hour-long period. The photos were selected according to different criteria. For example, neutral and fearful faces were shown when analyzing the influence of emotions. When analyzing the familiarity factor, known and randomly chosen people were shown. 

To verify whether a person perceives a face correctly, the participants pressed a button after each stimulus to indicate whether the face was shown inverted or upright. 

“Even at this stage, an Alzheimer's patient makes mistakes, so it is important to determine whether the impaired recognition of an object is due to memory or visual processes,” Komolovaitė says. 
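
To give a rough sense of how responses to individual face presentations are typically isolated from a continuous recording, the sketch below cuts an EEG signal into short time windows (epochs) around each stimulus. The stimulus channel name and the event codes for the conditions described above (neutral vs. fearful, familiar vs. unfamiliar, upright vs. inverted) are hypothetical placeholders, not the codes used by the researchers.

```python
# Illustrative only: the stimulus channel name and trigger codes below are
# hypothetical placeholders for the conditions described in the article.
import mne

raw = mne.io.read_raw_fif("subject_01_raw.fif", preload=True)
raw.filter(l_freq=1.0, h_freq=40.0)

# Find stimulus onsets recorded on the trigger channel.
events = mne.find_events(raw, stim_channel="STI 014")

# Hypothetical mapping of trigger codes to experimental conditions.
event_id = {
    "neutral/upright": 1,
    "neutral/inverted": 2,
    "fearful/upright": 3,
    "fearful/inverted": 4,
    "familiar/upright": 5,
    "familiar/inverted": 6,
}

# Cut -200 ms to +800 ms windows around each face presentation,
# baseline-corrected on the pre-stimulus interval.
epochs = mne.Epochs(
    raw, events, event_id=event_id,
    tmin=-0.2, tmax=0.8, baseline=(None, 0), preload=True,
)

# Conditions can then be compared, e.g. fearful versus neutral faces.
fearful = epochs["fearful"]
neutral = epochs["neutral"]
```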

The study used data from standard electroencephalography equipment, but data gathered from invasive microelectrodes would be better suited to creating a practical tool. These would enable experts to measure the activity of neurons more precisely, which would increase the quality of the AI model. 
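
The article does not specify the model architecture the team used, but a common baseline for this kind of EEG decoding task is a linear classifier trained on the epoched signals. The sketch below, which continues from the hypothetical epochs above, uses a scikit-learn pipeline with logistic regression purely as an illustration.

```python
# Illustrative baseline only: the actual AI model used in the study is not
# described in this article. `epochs` comes from the epoching sketch above.
import numpy as np
from mne.decoding import Vectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Compare two hypothetical conditions: neutral vs. fearful faces.
sub = epochs[["neutral", "fearful"]]
X = sub.get_data()                                  # (trials, channels, times)
y = np.isin(sub.events[:, 2], [3, 4]).astype(int)   # 1 = fearful, 0 = neutral

clf = make_pipeline(
    Vectorizer(),                      # flatten each epoch into a feature vector
    StandardScaler(),                  # normalise features across trials
    LogisticRegression(max_iter=1000),
)

# Cross-validated accuracy gives a rough sense of how well the stimulus
# category can be read out from the brain signals.
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.2f}")
```

A deeper model, such as the convolutional networks often applied to EEG, could replace the linear classifier in the same pipeline; cleaner signals, like those from invasive microelectrodes, would benefit either approach.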

“Of course, in addition to the technical requirements, there should be a community environment focused on making life easier for people with Alzheimer's disease. Still, in my personal opinion, after five years, I think we will still see technologies focused on improving physical function, and the focus on people affected by brain diseases in this field will only come later,” says Maskeliūnas.

“If we want to use this test as a medical tool, a certification process is also needed,” Komolovaitė continues.

Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.