Researchers at the University of Colorado and Duke University have developed a neural network that can accurately decode images into 11 different human emotion categories. The research team included Phillip A. Kragel, Marianne C. Reddan, Kevin S. LaBar, and Tor D. Wager.
Phillip Kragel explains neural networks as computer models that map input signals to an output of interest by learning a series of filters. Whenever a network is trained to detect a certain image or object, it learns the features that are unique to it, such as shape, color, and size.
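The article doesn't describe EmoNet's internals, but the idea of a "learned filter" can be shown with a toy example. The convolution routine and the hand-written edge filter below are purely illustrative; in a real convolutional network like EmoNet, the filter weights are learned from training data rather than written by hand.

```python
# A convolutional layer slides a bank of small filters across an image.
# Here we apply one hand-written 3x3 vertical-edge filter; a trained
# network would learn filter weights like these on its own.

def convolve2d(image, kernel):
    """Valid-mode 2D convolution (cross-correlation, as in most DL libraries)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# 6x6 toy image: dark left half (0), bright right half (1)
image = [[0, 0, 0, 1, 1, 1] for _ in range(6)]

# Vertical-edge filter (the kind of pattern early CNN layers learn)
edge_filter = [[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]]

response = convolve2d(image, edge_filter)
print(response[0])  # [0.0, 4.0, 4.0, 0.0] -- strongest at the dark/bright boundary
```

The filter responds strongly exactly where the image changes from dark to bright, which is what "detecting a feature" means at the lowest level of a network.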
The new convolutional neural network has been named EmoNet, and it was trained on visual images. The research team used a database of 2,185 videos spanning 27 different emotion categories, and from the collection they extracted 137,482 frames that were divided into training and testing samples. The categories were not just basic emotions but included many complex ones as well: anxiety, awe, boredom, confusion, craving, disgust, empathetic pain, entrancement, excitement, fear, horror, interest, joy, romance, sadness, sexual desire, and surprise.
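The article states that the 137,482 frames were divided into training and testing samples but not how. The sketch below assumes a simple shuffled 80/20 split purely for illustration; the actual split used by the researchers may have differed (for example, splitting by video rather than by frame to avoid near-duplicate frames leaking between sets).

```python
import random

def split_frames(frames, train_fraction=0.8, seed=0):
    """Shuffle and split a list of frames into train and test sets."""
    rng = random.Random(seed)      # fixed seed for reproducibility
    shuffled = frames[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

frames = list(range(137_482))      # frame IDs stand in for the real images
train, test = split_frames(frames)
print(len(train), len(test))       # 109985 27497
```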
The model was able to detect some emotions, like craving and sexual desire, with high confidence, but it had trouble with others, such as confusion and surprise. To categorize the images into emotions, the neural network relied on features such as color, spatial power spectra, and the presence of objects and faces in the images.
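A classifier like this typically ends in a softmax layer that turns raw scores into a probability per emotion category. The category names and scores below are invented for illustration, but the sketch shows the difference between a confident prediction (one probability dominates, as with craving) and an uncertain one (the probabilities are nearly flat, as with confusion or surprise).

```python
import math

def softmax(scores):
    """Convert raw network scores into probabilities that sum to 1."""
    m = max(scores)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

categories = ["craving", "confusion", "surprise"]    # illustrative subset

peaked = softmax([5.0, 0.5, 0.2])   # confident: one category dominates
flat   = softmax([1.0, 0.9, 0.8])   # uncertain: nearly even across categories

print(categories[peaked.index(max(peaked))], max(peaked))
print(categories[flat.index(max(flat))], max(flat))
```

When the distribution is flat, the top category barely beats the runners-up, which is the kind of behavior the article describes for the harder emotions.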
To build on the research, the team recorded the brain activity of 18 people while showing them 112 different images. They then showed the same images to the EmoNet network and compared the two sets of results.
We already use apps and programs every day that read our faces and expressions, for facial recognition, AI-driven photo manipulation, and unlocking our smartphones. This new development takes that much further, with the possibility of not only reading a face's physical features but reading a person's emotions and feelings through their face. It is an exciting but also concerning development, as privacy questions will surely arise. We already worry about facial recognition and what can happen with that data.
Aside from the privacy risks, this new development could help in many areas. For one, researchers often have to rely on participants reporting their own emotions. Now, researchers could instead read a participant's emotions from an image of their face, reducing error in the research and data.
“When it comes to measuring emotions, we’re typically still limited only to asking people how they feel,” said Tor Wager, one of the researchers on the team. “Our work can help move us towards direct measures of emotion-related brain processes.”
This new research could also help ground mental health labels like “anxiety” in measurable brain processes.
“Moving away from subjective labels such as ‘anxiety’ and ‘depression’ towards brain processes could lead to new targets for therapeutics, treatments, and interventions,” said Phillip Kragel, another one of the researchers.
This new neural network is just one of many exciting developments in artificial intelligence. Researchers are constantly pushing the technology further, and it will make an impact in every area of our lives. New developments in AI are taking it deeper into human behavior and emotion: while we mostly know AI through the physical realm, powering robotic muscles, arms, and other body-like hardware, the technology is now reaching into the human psyche.