
Can we learn to talk to animals? FPC and Earth Species Project are funding research to find out



  • FootPrint Coalition's Science Engine and the Earth Species Project are funding research investigating how AI can help understand animal communication

  • The new initiative will issue fast grants of up to $10,000, funded through the Experiment.com platform, to scientists across a range of research areas

  • ESP Senior AI Research Scientists Maddie Cusimano and Sara Keen are looking to fund a wide range of projects from foundational research to conservation applications.

  • The work could range from dataset design, collection, and processing for machine learning to field studies of interspecies communication and new ways to investigate non-human signals


What if we could truly understand what the animals around us were saying?


Every living thing on the planet has ways of communicating. Decoding those signals could unlock new understanding of how humans can live more harmoniously with the natural world around us.


At least that's the goal of a new funding initiative from Earth Species Project and FootPrint Coalition's non-profit Science Engine.


Founded in 2017, the Earth Species Project is a non-profit research lab focused on exploring how artificial intelligence can decode non-human communication.


The group, working with the non-profit arm of FootPrint Coalition (Robert Downey Jr.'s initiative to accelerate technologies creating a world of sustainable abundance), is launching a new grant program to fund early research into interspecies communication.


“The emerging field of AI for interspecies communication presents remarkable opportunities to reimagine our relationship with other species by providing insights into how they relate to each other – and to us,” said Rachel Kropa, FootPrint Coalition Managing Director and head of non-profit, in a statement. “Beyond increasing our understanding of non-human behavior and cognition, this work with our partners at the Earth Species Project also has the potential to support the development of more effective conservation strategies.”


Since its launch nearly 6 years ago, Earth Species Project has been steadily building out research in the field under the leadership of its Chief Executive Officer, Katie Zacarian.


The new projects will leverage machine learning to improve the ways humans perceive and understand the world, the group said.

“Decoding non-human communication is a huge challenge. It will only be solved through cross-disciplinary collaboration across a wide range of fields, from machine learning to neuroscience, ethology and biology,” said Zacarian.

The projects will be listed and funded through Experiment.com, which uses crowdfunding campaigns to finance and advance basic and applied research.


Alongside FootPrint Coalition's award-winning Science Engine, ESP will support several projects with up to $10,000 in fast-grant funding.

ESP Senior AI Research Scientists Maddie Cusimano and Sara Keen are leading the distribution of funds and are looking for a wide range of projects, including foundational research, creative explorations and conservation applications.


The work could range from dataset design, collection and processing for machine learning to field studies of interspecies communication and new machine learning methods for investigating non-human signals or facilitating species interaction, according to ESP and FootPrint Coalition.


ESP is engaged in a number of research projects that represent key steps on the organization’s technical roadmap toward decoding non-human communication. Most recently, it has published BEANS (Benchmark for Animal Sounds), the first-ever benchmark dataset for animal vocalizations, and BEBE, a behavioral benchmark for the self-supervised discovery of ethograms from animal movement data, co-authored with 15 research scientists from institutions around the world. These benchmarks are key to measuring progress in this newly emerging field.
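What a benchmark standardizes is simple to state: a frozen held-out evaluation split and a fixed metric, so that different models can be scored on identical footing. The short Python sketch below illustrates that idea only; the clips, labels and model are hypothetical placeholders, not the actual BEANS or BEBE tooling.

```python
# Minimal sketch of what a benchmark like BEANS or BEBE standardizes:
# a frozen held-out split plus a fixed metric, so any model can be
# compared fairly. Clips, labels and the "model" here are hypothetical.

from typing import Callable, Sequence

def benchmark_accuracy(model: Callable[[str], str],
                       test_clips: Sequence[str],
                       test_labels: Sequence[str]) -> float:
    """Top-1 accuracy of `model` on a frozen test split."""
    correct = sum(model(clip) == label
                  for clip, label in zip(test_clips, test_labels))
    return correct / len(test_labels)

if __name__ == "__main__":
    clips = ["clip_001.wav", "clip_002.wav", "clip_003.wav"]
    labels = ["alarm_call", "contact_call", "alarm_call"]
    always_alarm = lambda clip: "alarm_call"   # toy baseline model
    print(benchmark_accuracy(always_alarm, clips, labels))  # 0.666...
```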


ESP’s roadmap also identifies the importance of developing foundation models similar to the Large Language Models (LLMs) that have recently become dominant in human language processing. Trained on large amounts of data in a self-supervised manner, foundation models can perform difficult predictive tasks and are useful for domains with little annotated data, such as animal communication. ESP’s AVES (Animal Vocalization Encoder based on Self-Supervision), published earlier in 2023, represents the first-ever self-supervised, transformer-based foundation model for animal vocalizations.
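In practice, the pattern looks roughly like this: a pretrained self-supervised encoder turns raw audio into embeddings, and a small classifier is then fitted on whatever labels a biologist has. The sketch below is illustrative only, using torchaudio's general-purpose HuBERT bundle as a stand-in for a bioacoustics encoder such as AVES; the clip paths, labels and linear probe are hypothetical.

```python
# Sketch: embed animal vocalizations with a pretrained self-supervised
# encoder, then fit a tiny classifier on a handful of labels.
# torchaudio's HuBERT bundle stands in for an animal-sound encoder like
# AVES; the file paths and labels below are hypothetical.

import torch
import torchaudio

bundle = torchaudio.pipelines.HUBERT_BASE       # self-supervised encoder
encoder = bundle.get_model().eval()

def embed(path: str) -> torch.Tensor:
    """Return a fixed-size embedding for one audio clip."""
    waveform, sr = torchaudio.load(path)
    waveform = torchaudio.functional.resample(waveform, sr, bundle.sample_rate)
    with torch.no_grad():
        features, _ = encoder.extract_features(waveform.mean(0, keepdim=True))
    return features[-1].mean(dim=1).squeeze(0)   # average over time frames

# A few labelled clips go a long way when the encoder already captures
# acoustic structure from self-supervised pretraining.
clips  = ["call_01.wav", "call_02.wav", "call_03.wav"]
labels = torch.tensor([0, 1, 0])                 # e.g. two call types

X = torch.stack([embed(p) for p in clips])
probe = torch.nn.Linear(X.shape[1], 2)           # small linear "probe"
opt = torch.optim.Adam(probe.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = torch.nn.functional.cross_entropy(probe(X), labels)
    loss.backward()
    opt.step()
```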


ESP’s research team is working with partners to explore the communication systems of species ranging from crows to beluga whales, providing insights that can support the development of more effective conservation strategies. They are also working on experiments with real-time AI-animal “conversation” in collaboration with biologist partners in laboratory settings. This represents an important step in decoding communication because two-way interactive communication offers a powerful tool for inferring meaning.


We're not quite at the stage where everyone can talk to animals, but machine learning tools are opening up new opportunities for humans to learn from the natural world around us.


"We are very excited to be able to work with FootPrint Coalition to provide catalytic funding that will both inspire new research and deepen existing research, driving the whole field forward," Zacarian said.



