Could a computer be programmed to feel emotions? We're not sure, but we now know that a computer can be taught to recognize them in the facial expressions of mice.
Researchers at the California Institute of Technology in Pasadena have developed an artificial intelligence system that monitors the facial expressions of mice to decode the emotions they are feeling, including pain. The research could have direct applications in helping researchers better judge the effectiveness of painkillers during animal trials, and could help us understand the neural processes that encode certain facial expressions in humans.
The development of this AI is "an important first step" toward understanding the still-unknown aspects of emotion and how they arise in the brain, the researchers add.
“I was fascinated by the fact that we humans have emotional states which we experience as feelings,” says neuroscientist Nadine Gogolla at the Max Planck Institute of Neurobiology in Martinsried, Germany, who led the three-year study. “I wanted to see if we could learn about how these states emerge in the brain from animal studies.”
While it may feel like an intuitive truth to many of us, the idea that animals can display their emotions through facial expressions was first introduced in academic circles around 150 years ago by Charles Darwin. Ever since, however, this idea has remained largely a hypothesis — we simply didn't have a reliable means to capture and analyze the facial motions of model animals, or to tie them to neural activity.
For the study, the team fixed mice in place such that their heads were kept still, then introduced different sensory stimuli to the animals to trigger particular emotions. Examples of stimuli include placing sweet or bitter fluids on the mice’s lips to evoke emotions such as pleasure or disgust. Pain was produced by delivering small electric shocks to the tail, and feelings of malaise were produced by injecting the animals with lithium chloride.
Features that the team monitored during this time included the movement of the mouse's ears, cheeks, nose, and the upper part of its eyes. On their own, however, these observations weren't enough to reliably determine which emotions were conveyed by different expressions.
Because of this, they opted to break down the video recordings of the facial movements into ultra-short snapshots as the animals responded to the different stimuli. These were then fed through a machine-learning algorithm to teach it to recognize distinct expressions. The algorithm then learned to correlate these expressions with emotional states, which were inferred from the type of stimulus each animal was exposed to.
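The pipeline described above can be sketched in miniature. This is not the authors' code: the feature set (per-snapshot movement magnitudes for the ears, cheeks, nose, and eyes) and the nearest-centroid classifier are illustrative assumptions standing in for the study's actual video features and learning method, but they show the same idea of labeling snapshots by the stimulus that evoked them and letting a classifier map expressions to emotions.

```python
def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(snapshots):
    """snapshots: list of (features, label) pairs, where the label comes
    from the stimulus (e.g. sweet fluid -> "pleasure", tail shock -> "pain").
    Returns a model mapping each label to its mean feature vector."""
    by_label = {}
    for features, label in snapshots:
        by_label.setdefault(label, []).append(features)
    return {label: centroid(vs) for label, vs in by_label.items()}

def classify(model, features):
    """Assign the label whose centroid is nearest (squared Euclidean)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(model, key=lambda label: dist(model[label]))

# Toy data: hypothetical [ear, cheek, nose, eye] movement magnitudes
# per ultra-short snapshot, labeled by the evoking stimulus.
training = [
    ([0.9, 0.1, 0.8, 0.1], "pleasure"),   # ears/jaw forward, nose down
    ([0.8, 0.2, 0.7, 0.0], "pleasure"),
    ([0.1, 0.9, 0.2, 0.8], "pain"),       # ears back, cheeks bulged, squint
    ([0.2, 0.8, 0.1, 0.9], "pain"),
]
model = train(training)
print(classify(model, [0.85, 0.15, 0.75, 0.05]))  # prints "pleasure"
```

In the real study the input would be high-dimensional video features rather than four hand-picked numbers, but the labeling logic is the same: the emotion label is never annotated by a human judging the face; it is derived from the controlled stimulus, which is what removes observer bias.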
Among the AI's findings: a mouse experiencing pleasure would pull its nose down toward its mouth and pull its ears and jaw forward. When feeling pain, it would pull back its ears, bulge out its cheeks, and sometimes squint. The expressions identified were persistent, the team explains, and their strength correlated with the intensity of the stimulus.
The use of AI for this type of research goes a long way toward removing any biases that may interfere with the interpretation of the emotions each animal might be feeling, the team adds.
Using a technique called optogenetics, in which genetically modified neurons are made light-sensitive so they can be activated by shining light on them, the researchers examined where these emotions were formed in the brain. The team targeted individual neural circuits that have been linked to the formation of particular emotions in both humans and mice, using them to induce these emotional states in the animals. When stimulated this way, the mice assumed the corresponding facial expressions, validating the results of the previous steps.
Finally, the team used a specialized technique to observe individual neurons in the mouse brain that were activated only while particular emotions and facial expressions were evoked. This further step confirmed their results, and could help to pinpoint the specific neurons that govern the formation of emotional states and their associated facial expressions.
“They may represent part of a coding for emotions in the brain,” speculates Gogolla. “We think that encoding for emotion may be evolutionarily conserved, and so the encoding in humans and mice may share some common features.”
The paper “Artificial intelligence decodes the facial expressions of mice” has been published in the journal Nature.