

Can AI finally show us how animals think?


Shelley Brady
September 2, 2025 @ 10:37 am


Dogo is suspicious. Image via Victor G.

How is an animal feeling at a given moment? Humans have long recognised certain well-known behaviours, such as a cat hissing as a warning, but in many cases we’ve had little idea of what’s going on inside an animal’s head.

Now we have a better idea, thanks to a Milan-based researcher who has developed an AI model that he claims can detect whether an animal’s calls express positive or negative emotions. Stavros Ntalampiras’s deep-learning model, published in Scientific Reports, can recognise emotional tones across seven species of hoofed animals, including pigs, goats and cows. The model picks up on shared features of their calls, such as pitch, frequency range and tonal quality.

The analysis showed that negative calls tended to sit in the mid-to-high frequencies, while positive calls were spread more evenly across the spectrum. In pigs, high-pitched calls were especially informative, whereas in sheep and horses the mid-range carried more weight: a sign that animals share some common markers of emotion but also express them in ways that vary by species.
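The published model is a deep network, but the underlying idea of mapping a call’s acoustics to a positive or negative label can be sketched simply. Below is a minimal, hypothetical illustration in Python: it summarises a recording by its pitch, frequency spread and tonal quality, then trains an off-the-shelf classifier. The file names, labels and feature choices are assumptions for illustration, not the study’s actual pipeline.

```python
# Hypothetical sketch: classify animal calls as positive/negative from
# simple acoustic features. NOT the pipeline from the Scientific
# Reports study, which uses a deep-learning model.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def call_features(path: str) -> np.ndarray:
    """Summarise one recording as pitch + spectral statistics."""
    y, sr = librosa.load(path, sr=None)
    f0 = librosa.yin(y, fmin=60, fmax=2000, sr=sr)            # pitch track
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)   # brightness
    bandwidth = librosa.feature.spectral_bandwidth(y=y, sr=sr) # frequency-range proxy
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)         # tonal-quality proxy
    return np.concatenate([
        [np.nanmean(f0), np.nanstd(f0)],
        [centroid.mean(), bandwidth.mean()],
        mfcc.mean(axis=1),
    ])

# Placeholder data: 0 = negative call, 1 = positive call.
train_files = ["pig_squeal.wav", "goat_bleat.wav"]  # hypothetical recordings
labels = [0, 1]
X = np.stack([call_features(f) for f in train_files])
clf = RandomForestClassifier(n_estimators=200).fit(X, labels)
print(clf.predict(X))
```

In practice, any such classifier would need thousands of labelled calls per species, and its labels would only be as trustworthy as the behavioural ground truth behind them.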

For scientists who have long tried to untangle animal signals, this discovery of emotional traits across species is the latest leap forward in a field that is being transformed by AI.

The implications are far-reaching. Farmers could receive earlier warnings of livestock stress, conservationists might monitor the emotional health of wild populations remotely, and zookeepers could respond more quickly to subtle welfare changes.

This potential for a new layer of insight into the animal world also raises ethical questions. If an algorithm can reliably detect when an animal is in distress, what responsibility do humans have to act? And how do we guard against over-generalisation, where we assume that all signs of arousal mean the same thing in every species?

Of barks and buzzes

Tools like the one devised by Ntalampiras are not being trained to “translate” animals in a human sense, but to detect behavioural and acoustic patterns too subtle for us to perceive unaided.

Similar work is underway with whales, where New York-based research organisation Project Ceti (the Cetacean Translation Initiative) is analysing patterned click sequences called codas. Long believed to encode social meaning, these are now being mapped at scale using machine learning, revealing patterns that may correspond to each whale’s identity, affiliation or emotional state.

In dogs, researchers are linking facial expressions, vocalisations and tail-wagging patterns with emotional states. One study showed that subtle shifts in canine facial muscles correspond to fear or excitement. Another found that tail-wag direction varies depending on whether a dog encounters a familiar friend or a potential threat.

At Dublin City University’s Insight Centre for Data Analytics, we are developing a detection collar worn by assistance dogs that are trained to recognise the onset of a seizure in people with epilepsy. The collar uses motion sensors to pick up the dog’s trained alerting behaviours, such as spinning, and raises the alarm that the owner is about to have a seizure.
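The collar’s internals are not described here, but the core idea of spotting a behaviour like spinning from wearable sensors can be sketched in a few lines: integrate the gyroscope’s yaw rate over a short window and flag full rotations. Everything in the sketch below, including the sample rate, thresholds and function names, is a hypothetical illustration rather than the DCU design.

```python
# Hypothetical sketch: flag "spinning" from a collar gyroscope by
# integrating yaw rate and checking for full rotations in a window.
# Sample rate and thresholds are illustrative assumptions, not the
# Insight Centre's actual design.
import numpy as np

SAMPLE_HZ = 50        # assumed gyroscope sample rate
WINDOW_S = 3.0        # look for rotations within a 3-second window
TURNS_TO_ALERT = 2    # assumed: two full turns counts as a spin

def detect_spin(yaw_rate_dps: np.ndarray) -> bool:
    """yaw_rate_dps: yaw angular velocity in degrees per second."""
    window = int(SAMPLE_HZ * WINDOW_S)
    dt = 1.0 / SAMPLE_HZ
    for start in range(0, len(yaw_rate_dps) - window + 1, window // 2):
        # Cumulative rotation over this window, in degrees.
        rotation = np.abs(np.sum(yaw_rate_dps[start:start + window]) * dt)
        if rotation >= 360.0 * TURNS_TO_ALERT:
            return True  # an upstream system would raise the seizure alert
    return False

# Simulated signal: 3 s of fast turning (240 deg/s = 2 turns in 3 s).
spin = np.full(int(SAMPLE_HZ * 3), 240.0)
print(detect_spin(spin))  # True
```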

The project, funded by Research Ireland, aims to show how AI can draw on animal communication to improve safety, support timely intervention and enhance quality of life. In future we aim to train the model to recognise instinctive dog behaviours such as pawing, nudging or barking.

Honeybees, too, are under AI’s lens. Their intricate waggle dances – figure-of-eight movements that indicate food sources – are being decoded in real time with computer vision. These models highlight how small positional shifts influence how well other bees interpret the message.
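Once a vision model has tracked a dancing bee, the classic decoding step is simple geometry: the waggle run’s angle from vertical on the comb gives the food source’s bearing relative to the sun’s azimuth, and the run’s duration scales with distance. Here is a rough sketch of that conversion, with the distance calibration constant as an assumed round number, since real calibrations vary by colony and study.

```python
# Rough decoding of a tracked waggle run into a foraging vector.
# The geometry (angle from vertical = bearing relative to the sun's
# azimuth) is the classic interpretation; the distance calibration
# below is an assumed round number -- real calibrations vary by study.
import math

METERS_PER_SECOND_OF_WAGGLE = 1000.0  # assumed calibration constant

def decode_waggle(run_angle_deg: float, run_duration_s: float,
                  sun_azimuth_deg: float) -> tuple[float, float]:
    """run_angle_deg: waggle-run angle from vertical on the comb.
    Returns (compass bearing of the food source, distance in metres)."""
    bearing = (sun_azimuth_deg + run_angle_deg) % 360.0
    distance = run_duration_s * METERS_PER_SECOND_OF_WAGGLE
    return bearing, distance

# A run 30 degrees right of vertical lasting 1.5 s, with the sun at 180:
print(decode_waggle(30.0, 1.5, 180.0))  # (210.0, 1500.0)
```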

Caveats

These systems promise real gains in animal welfare and safety. A collar that senses the first signs of stress in a working dog could spare it from exhaustion. A dairy herd monitored by vision-based AI might get treatment for illness hours or days sooner than a farmer would notice.

Detecting a cry of distress is not the same as understanding what it means, however. AI can show that two whale codas often occur together, or that a pig’s squeal shares features with a goat’s bleat. The Milan study goes further by classifying such calls as broadly positive or negative, but even this relies on pattern recognition to infer emotion.

Emotional classifiers risk flattening rich behaviours into crude binaries of happy/sad or calm/stressed, such as logging a dog’s tail wag as “contentment” when it can sometimes signal stress. As Ntalampiras notes in his study, pattern recognition is not the same as understanding.

One solution is for researchers to develop models that integrate vocal data with visual cues, such as posture or facial expression, and even physiological signals such as heart rate, to build more reliable indicators of how animals are feeling. AI models are also going to be most reliable when interpreted in context, alongside the knowledge of someone experienced with the species.

It’s also worth bearing in mind that the ecological price of listening can be high. Training and running AI models carries carbon costs that, in fragile ecosystems, can undercut the very conservation goals these tools claim to serve. It’s therefore important that any such technology genuinely serves animal welfare, rather than simply satisfying human curiosity.

Whether we welcome it or not, AI is here. Machines are now decoding signals that evolution honed long before us, and will continue to get better at it.

The real test, though, is not how well we listen, but what we’re prepared to do with what we hear. If we burn energy decoding animal signals but only use the information to exploit them, or manage them more tightly, it’s not science that falls short – it’s us.


Shelley Brady, Postdoctoral Researcher in Animal Behaviour, Assistive Technology and Epilepsy, Dublin City University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

