Can AI help us talk to animals?

On the surface, it might not seem like Dr Dolittle and artificial intelligence (AI) have much in common. One belongs to the children’s literature of the early 1900s, while the other has its roots in the 21st century. One is a doctor turned veterinarian who can talk to animals; the other is a technology that can’t. Except…

AI has already given us the ability to bark instructions at virtual assistants like Siri and Alexa – could its potential extend to the animal kingdom? Could it help us decipher some of the mysteries of the natural world and perhaps one day allow us to ‘talk’ to animals?

There are certainly some who believe so. And some progress has already been made in trying to decode animal communication using AI. It may be a while before you can catch up with your dog or spill the tea with your turtle, but technology has improved – and hopefully will continue to improve – our understanding of other species and how they interact. As for communicating with animals, perhaps Dolittle walked (and talked) so that AI could run.

Do animals use language?

The first hurdle in “translating” animal communication is understanding what that communication looks like. Human language consists of verbal and non-verbal cues, and animal communication is no different.

Dogs wag their tails, for example, to convey a range of emotions. Bees dance to let other bees know where to find a good source of nectar or pollen. Dolphins use clicks and whistles to communicate information.

However, there is some debate as to whether this can be considered a “language” – a debate which, according to Dr Denise Herzing, Research Director of the Wild Dolphin Project, artificial intelligence could help put to bed.

“At the moment we don’t know if animals have language,” Herzing told IFLScience. “[But] Artificial intelligence can help us look for language-like structures that might indicate that animals have parts of a language.”

How can artificial intelligence “translate” animal communication?

“Research in bioacoustics has shown that animal vocalizations convey many types of information, from their identity to their condition, their internal state, and sometimes external objects or events,” Elodie F. Briefer, Associate Professor of Behavior and Animal Communication at the University of Copenhagen, told IFLScience. “All of this could be picked up by artificial intelligence.”

More specifically, with machine learning. This is a form of AI that can find patterns in data without being given explicit instructions. In theory, it could be used to process recordings of animal communication and create language models based on those recordings.

“Machine learning is a powerful tool because it can be trained to recognize patterns in very large datasets, so it could allow us to process large volumes of data and gain critical insight into how the information contained in animal sounds changes over time,” Briefer added.
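To give a flavor of the kind of pattern recognition Briefer describes, here is a deliberately tiny sketch. The acoustic features (call duration and mean pitch) and the call types are invented for illustration – real bioacoustics work uses far richer spectral features and much more sophisticated models – but the principle is the same: learn the pattern of each call type from labelled examples, then classify new calls.

```python
# Toy supervised classification of animal calls.
# Features (duration in seconds, mean pitch in Hz) and labels are
# invented for illustration, not real bioacoustic data.

def centroid(points):
    """Mean of a list of (duration, pitch) feature vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def train(labelled_calls):
    """Compute one centroid per call type from labelled examples."""
    by_label = {}
    for features, label in labelled_calls:
        by_label.setdefault(label, []).append(features)
    return {label: centroid(pts) for label, pts in by_label.items()}

def classify(model, features):
    """Assign a new call to the nearest call-type centroid."""
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(features, model[label]))
    return min(model, key=dist)

# Invented training data: short high-pitched "alarm" calls versus
# longer, lower-pitched "contact" calls.
calls = [
    ((0.10, 8000), "alarm"), ((0.15, 7500), "alarm"),
    ((0.80, 2000), "contact"), ((1.00, 1800), "contact"),
]
model = train(calls)
print(classify(model, (0.12, 7800)))  # a new short, high-pitched call
```

A real system would replace the two hand-picked features with thousands of values extracted from spectrograms, and the nearest-centroid rule with a trained neural network, but the workflow – label, train, classify – is the one Briefer outlines.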

It’s the same technology we use every day to power predictive text, Google Translate, and voice assistants. Shifting to animal communication may prove more difficult, but that hasn’t stopped researchers from trying.

“There are many different techniques and ways of approaching science,” Herzing told IFLScience. These will vary, she added, “[depending] on the data, the AI technique, or even understanding of the animals themselves.”

The Earth Species Project, for example, is a non-profit organization “dedicated to decoding non-human language.” Their focus so far has been on cetaceans and primates, but they say they will eventually expand to other animals, including mockingbirds.

The project uses a machine learning technique that treats a language as a shape, “like a galaxy where each star is a word and the distance and direction between the stars encodes relational meaning.” Languages can then be “translated by matching their structures together”.

Optimistically, Britt Selvitelle, co-founder of the Earth Species Project, believes the approach could help decode the first non-human language within the next decade, according to The New Yorker. Others, however, are more skeptical of artificial intelligence as a tool for uncovering animal communication.

It’s all well and good to analyze recordings, but they don’t make sense without context, says Julia Fischer of the German Primate Center in Göttingen. “[AI] is not a magic wand that gives you answers to questions of biology or questions of meaning,” she told New Scientist.

It’s still vital to look at nature and correlate recordings with actual observations, and that’s no mean feat.

As highly social animals, cetaceans are a good starting point for attempts to decode animal communication. Image credit: F Photography R /

What has been achieved so far?

Many projects are currently working to unlock the secrets of animal communication with the help of artificial intelligence, the Earth Species Project being one of them. Last December, the project published a paper that claims to have solved the “cocktail party problem” – the difficulty of distinguishing the source of one sound among many simultaneous sounds.

Imagine a cocktail party, if you will. Amidst the chatter and background noise, it’s almost impossible to tell who exactly the calls for another espresso martini are coming from. And the same issue exists when deciphering animal communication.

In the study, the researchers describe an experimental algorithm – which they applied to species including macaques, bottlenose dolphins and Egyptian fruit bats – that allowed them to identify which individual in a noisy group of animals was “talking”.
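The attribution half of that task – deciding who is “talking” – can be sketched in miniature. The version below compares an observed call against each individual’s known signature and picks the best match. The signatures, the dot-product similarity, and the dolphin names are all invented for illustration; the actual study used deep neural networks trained on real recordings of macaques, dolphins, and bats.

```python
# Toy attribution for the "cocktail party problem": given a recorded
# call, decide which individual produced it by comparing it against
# each animal's known signature. All values are invented.

signatures = {
    "dolphin_A": [0.0, 1.0, 0.0, -1.0, 0.0, 1.0],
    "dolphin_B": [1.0, 1.0, 0.0, 0.0, -1.0, -1.0],
}

def similarity(a, b):
    """Dot product as a crude similarity between two waveforms."""
    return sum(x * y for x, y in zip(a, b))

def attribute(call):
    """Return the individual whose signature best matches the call."""
    return max(signatures, key=lambda who: similarity(call, signatures[who]))

# A noisy copy of dolphin_A's signature.
observed = [0.1, 0.9, -0.1, -1.1, 0.0, 0.8]
print(attribute(observed))  # → dolphin_A
```

The hard part in the wild – and the reason the real work needs machine learning – is that calls overlap, signatures drift, and the “templates” themselves must be learned from noisy data rather than written down in advance.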

Artificial intelligence is establishing itself as a valuable tool in other areas of zoology as well. “[It] has been used primarily in a relatively new field called ‘ecoacoustics’, which monitors biodiversity through passive acoustic monitoring and requires very large datasets,” Briefer told IFLScience.

“People have also used it to extract information from long-term recordings (e.g., marine mammal identification from underwater recordings). More recently, it has been used for pattern detection in other contexts as well, such as for detecting underlying emotions in [pigs and chickens].”

Briefer’s work includes one such study. She and her colleagues trained an artificial intelligence system to recognize positive or negative emotions in pigs’ grunts and squeals.

In rodents, software called DeepSqueak has been used to judge whether an animal is experiencing stress based on its ultrasonic calls. These sounds, imperceptible to the human ear, are how rodents communicate socially. The software has also been used on primates and dolphins to help researchers automatically tag recordings of the animals’ calls.

Artificial intelligence is being used to help decode primate calls. Image credit: Gudkov Andrey /

The nonprofit Wild Dolphin Project, founded by Herzing, aims to use artificial intelligence to discover patterns in dolphin calls and explore communication between dolphins and humans. In 2013, after training a pod of dolphins to associate a specific whistle with a type of algae, researchers used a machine learning algorithm to recognize and translate the sound in the wild.

Meanwhile, Project CETI (Cetacean Translation Initiative) is attempting to decode the communication of sperm whales, using language models to decipher their clicks and establish their ‘language’.

Why these animals?

No single species stands out as the best candidate for decoding communication, Briefer believes, but even so, some have been targeted by researchers more than others.

“When we look at acoustic communication, of course the most interesting ones are those that are highly vocal (e.g. birds, pigs, meerkats, etc.) and those that have a large sound repertoire,” Briefer told IFLScience.

Similarly, social animals such as primates, whales and dolphins are more likely to have well-developed communication systems, which makes them ideal for study.

“Dolphins live in highly social societies, live long lives and have long memories, which suggests they have complex relationships to communicate,” Herzing explained. Intelligence can also play a role.

“Cetaceans, or at least dolphins, are known to have a high EQ [encephalization quotient], abilities to learn artificial languages, understand abstract ideas, and recognize themselves in a mirror,” Herzing added. “These are some of the foundations of intelligence.”

What are the benefits of understanding animal communication?

Beyond the obvious – finally finding out what your cat thinks about you – there are many ways in which an improved understanding of animal communication can be beneficial, for both humans and animals.

“For both captive and wild species, it allows us to understand them better and know when they are thriving or suffering,” Briefer told IFLScience. “This is vital for the species we have around us (pets and farm animals for example) as their well-being depends on us.”

Not only could this make us better pet owners, but it has the potential to change our relationship with all animals forever. “By knowing that animals have language, we hope that humans will understand that we are not the only sentient species on the planet,” Herzing added.

At the very least, it could inspire more sympathy for other species and lead us to rethink the way we treat them. This could have far-reaching implications, including the use of animals in sport, entertainment and research.

It could even cause a complete overhaul of livestock farming. With a better understanding of the animals around us, can we still justify practices that exploit and kill them? From a human perspective, there is also much we could learn, not only about animals but about ourselves and, perhaps, other life forms.

Understanding animal communication could also teach us about the evolution of language, Briefer told IFLScience. “The tools we develop with species on Earth may be applicable to distant worlds if we encounter other life forms,” Herzing speculated, adding that these tools could help us measure their intelligence and whether we could communicate with them.

Making the leap from animal to alien communication is definitely a move Dolittle never made. Should the fictional vet worry about being outdone? Not yet. Actually using AI to talk to animals, let alone aliens, is a big leap. But it certainly has the potential to improve our understanding of other species and the world around us. In fact, it already has. Maybe Dolittle is finally getting a little nervous.

This article first appeared in Issue 3 of the CURIOUS online journal, IFLScience. Sign up now to get every issue for free, delivered straight to your inbox.
