According to Nature, a team of researchers has recently employed artificial intelligence to analyze and interpret the facial expressions of mice. Mice are among the most commonly used laboratory animals, yet little is known about how they express themselves with their faces. The research could also help scientists understand which neurons influence specific facial expressions in humans.
The study of animal expressions is an old idea but a relatively new discipline. Darwin initially hypothesized that animal facial expressions might grant us insight into their emotions, but only recently have science and technology advanced to the point where it is possible to study such expressions and emotions.
David Anderson, a neuroscientist at the California Institute of Technology in Pasadena, explained that the study was an important step in demystifying how the brain manifests certain emotions and how those emotions might be expressed in facial muscles. Meanwhile, Nadine Gogolla, a neuroscientist at the Max Planck Institute of Neurobiology in Germany, explained the rationale behind the study. Gogolla led the study and was inspired by a 2014 paper written by Anderson and colleagues. In that paper, Anderson and colleagues hypothesized that emotions and other brain states should display certain measurable attributes, theorizing that the strength of a stimulus should affect the intensity of the emotion and that emotions should be persistent, continuing for a while even after the stimulus responsible for them has ended.
As Inverse explained, Gogolla and the other researchers filmed the faces of mice as the animals were exposed to a variety of stimuli, both pleasant and unpleasant. For instance, they were given either bitter or sweet fluids. The researchers stated that mice can shift their expressions by moving facial features such as the nose, eyes, ears, and cheeks. However, there was no easy way to link different facial expressions to different emotions. The research team addressed this problem by splitting the videos of the mice's faces into short clips, which were then fed into a machine learning algorithm.
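As a rough illustration of that pipeline, the sketch below splits a stream of per-frame feature vectors into short clips, summarizes each clip, and labels it by its nearest labeled prototype. The feature representation, clip length, and nearest-prototype classifier are all illustrative assumptions, not the study's actual method.

```python
import numpy as np

def split_into_clips(frames, clip_len):
    """Split a video (frames x features array) into consecutive short clips."""
    n_clips = len(frames) // clip_len
    return [frames[i * clip_len:(i + 1) * clip_len] for i in range(n_clips)]

def clip_descriptor(clip):
    """Summarize a clip as the mean of its per-frame feature vectors."""
    return clip.mean(axis=0)

def nearest_prototype(descriptor, prototypes):
    """Label a clip descriptor by its closest labeled prototype
    (Euclidean distance) -- a stand-in for the real classifier."""
    labels = list(prototypes)
    dists = [np.linalg.norm(descriptor - prototypes[lab]) for lab in labels]
    return labels[int(np.argmin(dists))]
```

In this toy setup, each frame would first be reduced to a numeric feature vector (for example, measurements of ear, nose, and cheek positions); the real study's feature extraction and classification were considerably more sophisticated.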
Camilla Bellone, a researcher at the University of Geneva in Switzerland, says that the AI-driven method of examining facial expressions is valuable “because it avoids any biases of the experimenter”.
The AI algorithm was reportedly able to recognize the various facial expressions of the mice, as movements of different facial muscles are correlated with different emotions. A mouse shows that it is experiencing pleasure by pulling its jaw and ears forward and pulling the tip of its nose down toward its mouth. Moreover, when analyzing how the expressions manifested in response to stimuli, the research team found that the expressions were both persistent and correlated with stimulus strength, just as Anderson and colleagues had theorized.
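The two properties described above, persistence past stimulus offset and scaling with stimulus strength, can be expressed as simple checks over an expression-intensity trace. The sketch below is a minimal illustration under assumed thresholds and time windows, not the study's actual analysis.

```python
import numpy as np

def expression_intensity(clip_scores, baseline):
    """Intensity of an expression as deviation from a neutral baseline score."""
    return np.abs(np.asarray(clip_scores) - baseline)

def persists_after_offset(intensity, stimulus_end, window, threshold):
    """True if the expression intensity stays above a threshold for
    `window` timepoints after the stimulus ends (persistence)."""
    after = intensity[stimulus_end:stimulus_end + window]
    return bool(np.all(after > threshold))

def scales_with_strength(strengths, peak_intensities):
    """True if peak expression intensity rises with stimulus strength
    (positive correlation across trials)."""
    return bool(np.corrcoef(strengths, peak_intensities)[0, 1] > 0)
```

Here `clip_scores` stands for some per-clip expression score produced by a classifier; the baseline, threshold, and window values would be chosen from the data in a real analysis.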
The team of researchers then used a technique known as optogenetics to try to determine which brain cells are responsible for these emotions. The research team examined the individual neural circuits associated with certain emotions in animals. When these circuits were stimulated, the mice made the corresponding facial expressions.
The research team also utilized a technique referred to as two-photon calcium imaging, which can track the activity of individual neurons. Using this technique, they identified neurons in the brains of the mice that activated only when certain facial expressions, and therefore emotions, were observed. Gogolla theorized that these neurons might represent part of an encoding of emotions in the brain, one possibly conserved throughout the evolutionary history of mammals, meaning mice and humans may share some common features of this encoding.
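The idea of a neuron that fires only during a particular expression can be sketched as a simple selectivity filter over recorded activity traces. The array shapes, thresholds, and criterion below are hypothetical simplifications of what calcium-imaging analyses actually involve.

```python
import numpy as np

def selective_neurons(activity, expression_on, on_thresh=1.0, off_thresh=0.2):
    """Find neurons active mainly during a given facial expression.

    activity      : (n_neurons x n_timepoints) array of activity traces
    expression_on : boolean mask over timepoints where the expression occurs

    A neuron counts as selective if its mean activity is high while the
    expression is shown and low otherwise (thresholds are illustrative).
    """
    expression_on = np.asarray(expression_on, dtype=bool)
    mean_on = activity[:, expression_on].mean(axis=1)
    mean_off = activity[:, ~expression_on].mean(axis=1)
    return np.where((mean_on > on_thresh) & (mean_off < off_thresh))[0]
```

In a real analysis, selectivity would be established statistically across many trials rather than with fixed thresholds; this sketch only conveys the "active only during this expression" criterion described above.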