A group of researchers at Tohoku University has used a neural network model that reproduces the brain on a computer to gain insight into why people with autism spectrum disorder have difficulty interpreting facial expressions.
The research was published in the journal Scientific Reports on July 26, 2021.
Recognizing Different Emotions
Yuta Takahashi is a co-author of the paper.
“Humans recognize different emotions, such as sadness and anger, by looking at facial expressions. Yet little is known about how we come to recognize different emotions based on the visual information of facial expressions,” said Takahashi.
“It is also not clear what changes occur in this process that lead to people with autism spectrum disorder struggling to read facial expressions,” Takahashi continued.
Predictive Processing Theory
The research group relied on predictive processing theory, which holds that the brain constantly predicts the next sensory stimulus. When the prediction is wrong, the brain adapts itself, and sensory information such as facial expressions helps reduce prediction error.
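The core idea can be illustrated with a toy sketch (not the authors' model): a single predictive weight is repeatedly updated to shrink the error between the predicted and actual next stimulus. The sinusoidal signal, linear predictor, and learning rate here are all arbitrary choices for illustration.

```python
import numpy as np

# Toy stimulus sequence: a 1-D signal the "brain" must anticipate.
stimuli = np.sin(np.linspace(0, 4 * np.pi, 200))

w = 0.0          # single predictive weight (hypothetical)
lr = 0.05        # learning rate
errors = []

for t in range(len(stimuli) - 1):
    prediction = w * stimuli[t]          # predict the next stimulus
    error = stimuli[t + 1] - prediction  # prediction error
    w += lr * error * stimuli[t]         # adapt to reduce future error
    errors.append(error ** 2)

# Prediction error shrinks as the internal model adapts.
print(np.mean(errors[:20]), "->", np.mean(errors[-20:]))
```

The same adapt-to-reduce-error loop, scaled up to video input and many layers of neurons, is what the team's model applies to facial movements.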
The artificial neural network model developed by the team is based on predictive processing theory, and it was able to reproduce the developmental process of emotion recognition. It did this by training itself to predict how parts of the face would move in videos of facial expressions.
Clusters of emotions then self-organized in the neural network model’s higher-level neuron space, even though the model was never told which emotion the facial expression in each video corresponded to.
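As a rough illustration of how clusters can emerge without emotion labels (a toy sketch, not the paper's architecture), the following runs a minimal k-means loop on synthetic "higher-level activity" vectors; the two prototypes, the noise level, and the 2-D feature space are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for higher-level neuron activity: each facial-
# expression video is summarized as a 2-D vector drawn around one of
# two prototypes; no emotion labels are ever provided.
protos = np.array([[1.0, 0.0], [0.0, 1.0]])
data = np.vstack([p + 0.05 * rng.standard_normal((30, 2)) for p in protos])

# Minimal k-means: clusters self-organize from the data alone.
centers = data[[0, 30]].copy()           # one seed point per region
for _ in range(10):
    dists = np.linalg.norm(data[:, None] - centers[None], axis=2)
    assign = dists.argmin(axis=1)        # nearest-center assignment
    centers = np.array([data[assign == k].mean(axis=0) for k in range(2)])

# The two unlabeled "emotions" end up in separate clusters.
print(np.bincount(assign))
```

The grouping falls out of the structure of the data itself, which mirrors the unsupervised way the model's emotion clusters formed.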
The model was also able to generalize to unknown facial expressions that were not given in the training, reproducing facial part movements while minimizing prediction error.
The team then induced abnormalities in the neurons’ activities during experiments to probe the effects on learning development and cognitive characteristics. In the model where heterogeneity of activity in the neural population was reduced, generalization ability decreased: the formation of emotion clusters in higher-level neurons was inhibited, and the model tended to fail to identify the emotion of unknown facial expressions, a symptom also seen in autism spectrum disorder.
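One way to see why reduced heterogeneity can hurt discrimination is a toy population-coding sketch (again, not the authors' model): when every simulated neuron shares identical tuning, two different expressions can evoke identical population responses, whereas diverse tuning keeps them separable. The tuning matrices and tanh nonlinearity are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

stim_a = np.array([1.0, 0.0])            # two "unknown" expressions
stim_b = np.array([0.0, 1.0])

def population_response(stim, tuning):
    # Each row of `tuning` is one neuron's weights; tanh is an
    # arbitrary illustrative nonlinearity.
    return np.tanh(tuning @ stim)

hetero = rng.standard_normal((50, 2))    # diverse tuning across neurons
homo = np.ones((50, 2))                  # heterogeneity removed

for name, tuning in [("heterogeneous", hetero), ("homogeneous", homo)]:
    ra = population_response(stim_a, tuning)
    rb = population_response(stim_b, tuning)
    print(name, float(np.linalg.norm(ra - rb)))
```

With uniform tuning the two stimuli produce exactly the same response pattern (separation 0), so no downstream readout can tell them apart; diverse tuning preserves a nonzero separation, loosely echoing the model's loss of generalization when population heterogeneity was reduced.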
Takahashi says the study suggests that a neural network model grounded in predictive processing theory can account for emotion recognition from facial expressions.
“We hope to further our understanding of the process by which humans learn to recognize emotions and the cognitive characteristics of people with autism spectrum disorder,” Takahashi said. “The study will help advance the development of appropriate intervention methods for people who find it difficult to identify emotions.”