Two birds (zebra finches) are sitting on a tree branch. A voice analysis graph is overlaid, illustrating scientific efforts to decode the language of animals.
Earth Species Project/McGill University

Can AI Unlock the Secrets of Animal Talk?

With machine learning systems trained on the sounds of nature, the Earth Species Project aims to decode how different species communicate. What can we expect of an “animal GPT”?

What do birds tell each other when they tweet? Are whales singing of love and loss? How do monkeys name their children? Scientists have been studying the many ways in which animals communicate for decades – and while they’ve gained some insights, many open questions remain.

Now artificial intelligence is giving the field of decoding animal communication a boost. Will algorithms that can understand human language also be able to understand how animals speak with each other?

The Earth Species Project is developing AI systems that support human researchers in their quest to decipher the language of nature. The non-profit organization’s Director of Impact, Jane Lawton, explains how the Earth Species Project is building an “animal GPT” (akin to AI systems like ChatGPT), how this promises to benefit biodiversity, and why a chit-chat with wild animals – even if possible one day – might not be such a good idea after all.

Outdoor portrait of Jane Lawton, Director of Impact at the Earth Species Project, sitting on a bench in a park.

Jane Lawton is Director of Impact for Earth Species Project, a research lab and impact organization focused on using AI to decode animal communication, with the goal of changing the way human beings relate to nature. Her diverse experience has ranged from designing conservation programs in Africa with the Jane Goodall Institute to setting global standards for conservation with IUCN in Asia.

What do we know about the way animals communicate?

Researchers have been studying this subject for decades, and the things we know already begin to point to much more sophisticated levels of language and cognitive ability than many people would expect. Back in the 1980s, scientists discovered, for example, that vervet monkeys have different alarm calls for an eagle versus a leopard versus a snake, because the responses need to be different. You’ve got to climb a tree for a leopard. You’ve got to move away quickly for a snake. You’ve got to hide under something big for an eagle.

Recently, researchers have discovered that both elephants and marmosets have unique names for each other. Orangutans can communicate about dangers that are no longer present – in effect, a past tense. Even coral larvae can use sound to navigate towards a healthy reef. All of this confirms our assumption that animal communication is much more complex than we once thought.

How can AI lift this research to the next level?

At its most basic level, artificial intelligence is going to help us analyze data and find patterns in the data. The reason we think this is possible is because AI has progressed to the stage where it can learn human languages. It can predict the next word in a sentence and also translate across different modalities – for example take a text prompt, understand the meaning, and then produce a beautiful picture.

These developments give us hope that we can begin to understand animal communication, because if we can translate between human languages without dictionaries, then essentially the model can start to learn any kind of language with the right training data. We are now working to harness these developments and apply them to animal communication. The goal is to help the researchers who have been learning these incredible things about other species analyze the vast amounts of data that are now being generated.
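
To make the “predict the next word” idea above concrete, here is a deliberately tiny Python sketch: a bigram model that counts word pairs in a made-up corpus and guesses the most frequent successor. Modern systems replace the counting with deep neural networks, but the underlying prediction task is the same.

```python
from collections import Counter, defaultdict

# A toy bigram "language model": count which word follows which in a tiny
# made-up corpus, then predict the most frequent successor.
corpus = "the whale sings the whale sings the whale dives the bird sings".split()

next_words = defaultdict(Counter)
for first, second in zip(corpus, corpus[1:]):
    next_words[first][second] += 1

def predict_next(word):
    counts = next_words[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("whale"))  # 'sings' (seen twice, vs. once for 'dives')
```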

An elephant with its calf, via Pexels/Gerbert Voortman

Elephant calls: We don’t know (yet) what the mother named her child – but we do know that elephants give each other names, just like humans do. (Photo: Gerbert Voortman/Pexels)

Where does the data come from?

Our specialty is building machine learning models, so we don’t collect data in the field ourselves. We work closely with researchers who are studying a number of different species and gathering data in the field. We’ve done work with elephants. We’ve done work with orangutans, trying to understand the impact of forest fires and smoke inhalation on the way they communicate. We have also done research on beluga whales, trying to understand the social structure of endangered populations through their vocalizations. One more unusual species we’ve been studying with a research partner is the jumping spider: we analyze how the spiders communicate and how important that communication is to their breeding behavior.

But spiders surely don’t talk?

No, but they create vibrations across a substrate when they move – that’s essentially their vocalization, their sound, their way of communicating. They also do very intricate dances. So our research partner at the University of California at Berkeley records their vibrations to understand how those communication patterns impact their breeding behavior.

Revisit DLD Nature

DLD Nature brought together experts from science, business, politics and culture to present solutions for biodiversity loss, protecting nature and transitioning to a sustainable, future-forward economy. For videos, images and related articles please visit our conference page.

What insights have you gained?

We’re seeing incredible improvements in the ability to analyze data rapidly and at scale – for tasks that researchers in the animal domain have struggled with for a long time. Our goal is to develop a kind of animal GPT, which is trained on vast amounts of data and can be applied to many different tasks.

We hope to lay the foundation of a new field that we are currently calling “animal language processing.” We have already built the first ever foundation model for animal vocalizations, called AVES. It works really, really well on classification and detection. We’ve now moved on to a new iteration, BirdAVES – a model trained specifically on bird data. This has been scaled up significantly and is performing even better.
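
For readers curious how an encoder like AVES gets used for classification, here is a minimal Python sketch of the common frozen-encoder pattern: embed a recording with a pretrained model, then train a small “linear probe” on the embeddings. The checkpoint and file names are placeholders, and loading the published AVES weights into TorchAudio’s stock hubert_base constructor is an assumption made for illustration – consult the project’s own repository for the actual loading code.

```python
import torch
import torchaudio

# AVES is HuBERT-based; the checkpoint and audio file names below are
# hypothetical placeholders, not real paths from the project.
model = torchaudio.models.hubert_base()
model.load_state_dict(torch.load("aves-base.pt"))  # hypothetical checkpoint
model.eval()

waveform, sr = torchaudio.load("finch_call.wav")   # hypothetical recording
waveform = waveform.mean(dim=0, keepdim=True)      # mix down to mono: (1, time)
waveform = torchaudio.functional.resample(waveform, sr, 16_000)

with torch.no_grad():
    features, _ = model.extract_features(waveform)  # one tensor per layer
clip_embedding = features[-1].mean(dim=1)           # (1, 768) clip-level vector

# A "linear probe" on top of the frozen encoder: a single trainable layer
# mapping the embedding to species labels, trained on annotated clips.
num_species = 10                                    # example label count
probe = torch.nn.Linear(clip_embedding.shape[-1], num_species)
logits = probe(clip_embedding)
print(logits.shape)  # torch.Size([1, 10])
```

Detection works along the same lines, except the probe scores short windows of audio rather than whole clips.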

What will this “animal GPT” be able to do?

The first iteration of animal GPT will allow researchers to query their bioacoustic data in natural language to detect vocalizations and classify them according to things like species and even age and sex. Our goal with this first version is to prove that our models can perform as well as or better than existing tools while also performing some novel tasks that aren’t possible today.
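
The Earth Species Project has not published this system, so the sketch below is purely illustrative. It shows the general retrieval mechanics that natural-language querying of an audio archive typically builds on: a text query and every audio clip are mapped into one shared embedding space, and clips are ranked by cosine similarity. The encoders here are untrained stand-ins, and every name and dimension is a made-up placeholder.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
EMBED_DIM = 512

# Untrained stand-ins for real text and audio encoders (pure assumption):
# in a trained system, matching text and sound would land close together.
text_encoder = torch.nn.Linear(300, EMBED_DIM)     # in: toy text features
audio_encoder = torch.nn.Linear(128, EMBED_DIM)    # in: toy audio features

query_vec = text_encoder(torch.randn(1, 300))      # e.g. "juvenile female calls"
clip_vecs = audio_encoder(torch.randn(1000, 128))  # 1,000 archived clips

# Rank every clip against the query by cosine similarity; keep the top five.
scores = F.cosine_similarity(query_vec, clip_vecs)  # shape: (1000,)
top_scores, top_idx = scores.topk(5)
print(top_idx.tolist())  # indices of the best-matching clips
```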

Can we expect to talk to animals at some point, like we can with humans?

We don’t know exactly, but we feel this is unlikely to be a Doctor Dolittle moment where we are able to have a fluent conversation with another species. Our organization is really about finding ways to listen and understand, rather than providing human beings with an opportunity to speak directly to other species – because we’re not sure that would be beneficial to those other species.

Chart from an Earth Species Project video explains how AVES works

The technology behind the insights: If you want to understand the technical details, take a look at the AVES explainer video on the Earth Species Project’s YouTube channel.

Why not? What would be wrong with that idea?

First off, it seems to serve mostly a human need – not something that the animals require or desire. But more importantly, there are real risks here. We are going to arrive fairly quickly at a point where a machine learning model might be able to generate a novel vocalization and potentially get into a fluent two-way exchange with another species. But the likelihood is that we as human beings will not be able to understand the meaning of that exchange.

That brings up a host of ethical concerns. A system like that could be used by poachers, for example, or it could be used by people trying to control animals more closely in factory farming settings. These are not the things we’re aiming for. We are aiming for the ability to understand and to listen, because we believe that could be transformative in the way human beings think about their place in the world.

Where do you see direct benefits for animal conservation?

Understanding the language of other animals is not going to solve everything. But it is a window into the sophistication and wonder of the natural world. There can be real, immediate benefits for conservation: if we manage to accelerate the workflows of researchers who are studying other species, they gain a better understanding of those species. They can then design better conservation strategies that actually fit the culture of that species.

As fascinating as your work is, why should people care if they’re not particularly interested in nature?

For us, the ultimate goal of this work is to find ways to reconnect people with nature. It’s really to help push human beings toward a point where we understand that we’re part of nature and not separate from it. That is important because we are facing existential climate and biodiversity crises. Something is broken in our relationship with the planet, which is driving many other species toward extinction – including, some experts say, potentially the human race as well.

So finding ways to better understand the complexity and the interconnectedness of everything that is happening on the planet is actually vital to our own survival as a species. And our work is a way to do that, to understand that complexity and to reconnect.

Jane Lawton (Earth Species Project) speaks with Maria Furtwängler (MaLisa Foundation) at the DLD Nature conference 2024.

New Ways of Understanding Nature

Watch Jane Lawton in conversation with Maria Furtwängler, Mara-Daria Cojocaru and Katrin Vohland at the DLD Nature conference.

Related Articles

Matthew Gould, CEO of the Zoological Society of London, smiling, sitting on a bench surrounded by nature.

Biodiversity: “We Have to Put a Price on Natural Capital”

Preserving the planet’s resources is not just essential to our own survival – leading the way will also be good business, says Matthew Gould, CEO of the Zoological Society of London.
Karen Bakker, author, researcher of digital innovation and environmental governance at the University of British Columbia

Decoding the Hidden Sounds of Nature

Author and researcher Karen Bakker explores how technology helps us understand the way animals and plants communicate – and why that’s vital to human life on Earth as well.