If you watch and love (as much as I do) the Amazon production of “The Boys,” the series about a world populated with deeply flawed superheroes, you’re no doubt familiar with the character called The Deep, the underwater-breathing, talk-to-the-fishies, self-involved numbskull who is pretty but dim-witted.
The Deep is also wracked by self-doubt, as in this S1E4 exchange with his therapist:
The Deep: I mean, yeah, I can talk to fish. So what? How often do you need to be saved by a school of salmon?
Psychiatrist: Kevin, that’s just not true. Where would that Carnival cruise ship be without you?
The Deep: Yeah, I know.
The Deep’s ability to talk to the animals presents him as a sort of perverted aquatic Dr. Dolittle.
That kind of animal-to-human two-way communication may never happen. But thanks to machine learning, we might not be that far off from understanding what some animals are saying to each other, as this New York Times article by Emily Anthes explains:
Machine-learning systems, which use algorithms to detect patterns in large collections of data, have excelled at analyzing human language, giving rise to voice assistants that recognize speech, transcription software that converts speech to text and digital tools that translate between human languages.
In recent years, scientists have begun deploying this technology to decode animal communication, using machine-learning algorithms to identify when squeaking mice are stressed or why fruit bats are shouting. Even more ambitious projects are underway — to create a comprehensive catalog of crow calls, map the syntax of sperm whales and even to build technologies that allow humans to talk back.
“Let’s try to find a Google Translate for animals,” said Diana Reiss, an expert on dolphin cognition and communication at Hunter College and co-founder of Interspecies Internet, a think tank devoted to facilitating cross-species communication.
The field is young and many projects are still in their infancy; humanity is not on the verge of having a Rosetta Stone for whale songs or the ability to chew the fat with cats. But the work is already revealing that animal communication is far more complex than it sounds to the human ear, and the chatter is providing a richer view of the world beyond our own species.
“I find it really intriguing that machines might help us to feel closer to animate life, that artificial intelligences might help us to notice biological intelligences,” said Tom Mustill, a wildlife and science filmmaker and the author of the forthcoming book, “How to Speak Whale.” “This is like we’ve invented a telescope — a new tool that allows us to perceive what was already there but we couldn’t see before.”
Studies of animal communication are not new, but machine-learning algorithms can spot subtle patterns that might elude human listeners. For instance, scientists have shown that these programs can tell apart the voices of individual animals, distinguish between sounds that animals make in different circumstances and break their vocalizations down into smaller parts, a crucial step in deciphering meaning.
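To get a feel for what that kind of pattern-spotting involves, here’s a toy sketch in Python. It is not any research group’s actual pipeline — the call types, frequencies, and the nearest-centroid classifier are all my own illustrative assumptions — but it shows the basic idea: extract an acoustic feature (here, the dominant frequency) from each vocalization, then use it to distinguish between sounds made in different circumstances.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 8000  # sample rate in Hz (toy value)

def synth_call(freq_hz, n=2048):
    """Simulate a vocalization as a noisy tone at a dominant frequency.

    Stands in for a real recorded animal call in this sketch.
    """
    t = np.arange(n) / SR
    return np.sin(2 * np.pi * freq_hz * t) + 0.3 * rng.standard_normal(n)

def dominant_freq(signal):
    """Feature extraction: the frequency bin carrying the most energy."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / SR)
    return freqs[np.argmax(spectrum)]

# Two hypothetical call types, pitched differently (e.g., alarm vs. contact calls)
calls_a = [synth_call(400 + rng.uniform(-20, 20)) for _ in range(10)]
calls_b = [synth_call(1200 + rng.uniform(-20, 20)) for _ in range(10)]

# A minimal classifier: compare a new call's feature to each type's average
centroid_a = np.mean([dominant_freq(c) for c in calls_a])
centroid_b = np.mean([dominant_freq(c) for c in calls_b])

def classify(signal):
    """Assign a call to whichever type's centroid its dominant frequency is nearer."""
    f = dominant_freq(signal)
    return "type A" if abs(f - centroid_a) < abs(f - centroid_b) else "type B"
```

Real systems use far richer features (spectrograms fed to neural networks rather than a single frequency), but the workflow — featurize recordings, then cluster or classify them — is the same shape.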
It’s an interesting article, and you can read it in its entirety here.
