- by Tannya Birajdar (PAP Cohort-2)
Communicating underwater is challenging. Light and odours do not travel far in water, but sound does: it moves about four times faster in water than in air and carries over great distances, which is why marine mammals rely on sound to communicate. The most famous of these underwater vocalisations is undoubtedly the whale song.
Whales are extremely smart creatures; the sperm whale's brain is the largest of any animal, roughly six times heavier than a human's. They are also very social, using bursts of clicks (patterned sequences known as codas), whistles, and pulsed calls to communicate.
Whale song is known to travel up to 1,000 km through the ocean, and the loudest whale sounds reach around 230 dB, carrying messages to distant pods. We have even discovered that songs vary between families and are passed down to future generations, suggesting a form of cultural learning.
Whale vocalisations are not only extremely loud but also incredibly organised. They may not sound like much to the ear, but when slowed down and viewed on a spectrogram, each click reveals a complex collection of shorter clicks inside it, with even shorter clicks inside those, and so on. The more closely we zoom in on a click, the more detail it reveals.
Whales use clicks not only to interact but also for echolocation, helping them navigate the ocean and locate their prey, which, depending on the species, mainly consists of squid or plankton. Their highly developed neocortex (responsible for attention, thought, and perception) and strong social bonds suggest that their communication may be far more advanced than previously thought.
Scientists have long known that whales use a range of vocalisations, but understanding the complexity behind these sounds has remained a challenge.
But what if there is a way to understand
their language?
Machine learning algorithms have revealed that whales adjust the timing and sequence of their clicks depending on social context, suggesting that whale communication is adaptive rather than fixed.
Marine biologists, animal communication experts, and artificial intelligence innovators have come together, using models like ChatGPT to find ways to interpret, and even respond to, whale communication.
But why should we bother? Why is it so important to understand a language that has eluded us for centuries?
This research is not just about the satisfaction of solving a new puzzle. In recent years, we have noticed that whales have become much quieter; their songs are no longer heard as often as before. The cause is the rise in human noise pollution from boats, underwater surveillance, and oil tankers. Whales have been observed to stop communicating when they can hear this cacophony, even from 200 miles away.
This disrupts their ability to tell other whales where good feeding grounds are, or whether danger lies ahead, threatening both ecosystems and their survival. The quieting, together with the massive datasets needed to capture such complex vocal structures, makes it a challenge to understand what they are saying.
Understanding whale song not only opens up new ways to protect the ocean without disrupting its balance, but also teaches us how these mammals adapt to their environment and what we can do to make that adaptation easier.
Scientists are deploying underwater microphones (hydrophones), robotic fish, and tags to record whale vocalisations; the recordings are then stored and translated with the help of artificial intelligence. Respected marine ecologist Dr Carlos Duarte says that big data gathered on supercomputers, combined with AI tools such as ChatGPT and Tron, has let the international science community fast-track the deciphering of the complex vocalisations of whales, dolphins, and other cetaceans. However, he says, understanding the language is still a work in progress.
“Scientists now play back those
whale words, and the whales actually answer, but we don’t have any idea what
we’re saying,” Duarte says.
In addition, a team of researchers led by Pratyusha Sharma at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), working with Project CETI (a nonprofit focused on using AI to understand whales), used statistical models to analyse whale codas and identified a structure in their communication that resembles features of the complex vocalisations humans use.
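Part of that analysis comes down to timing: how long a coda lasts and how the gaps between its clicks are arranged. As a rough illustration only (the click times, function name, and feature choices below are assumptions made for this sketch, not Project CETI's code), a coda can be boiled down to a handful of numbers like this:

```python
# A minimal, illustrative sketch of turning a sperm-whale coda into numbers.
# The click times are made up; real projects detect clicks automatically
# from hydrophone recordings. This is not Project CETI's code.
import numpy as np

def coda_features(click_times_s):
    """Describe a coda by its tempo and rhythm.

    tempo  = how long the whole coda lasts (seconds)
    rhythm = inter-click intervals as fractions of that duration,
             so the same pattern at different speeds still matches.
    """
    clicks = np.asarray(click_times_s)
    intervals = np.diff(clicks)        # gaps between consecutive clicks
    tempo = clicks[-1] - clicks[0]     # total duration of the coda
    rhythm = intervals / tempo         # normalised click pattern
    return tempo, rhythm

# A hypothetical five-click coda: two spaced clicks, then three fast ones.
example_coda = [0.00, 0.25, 0.50, 0.58, 0.66]
tempo, rhythm = coda_features(example_coda)
print(f"tempo = {tempo:.2f} s, rhythm = {np.round(rhythm, 2)}")
```

Because the rhythm is scaled by the coda's length, two whales repeating the same pattern at different speeds still produce matching numbers, which is what lets statistical models spot recurring patterns across thousands of recordings.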
By training AI on huge datasets of whale sounds, scientists are using neural networks to pick out subtle variations in these patterns that might correspond to different meanings, for behaviours such as mating or foraging. This opens a whole new window onto whether whales use something like grammar and syntax to form sentences and songs.
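To make that idea concrete, here is a minimal sketch of this kind of pattern-sorting. The features and behaviour labels are random placeholders standing in for real annotated recordings, and the library choice (scikit-learn) and every number are illustrative assumptions, not any research group's actual pipeline.

```python
# A toy sketch of training a small neural network to sort whale calls
# into behavioural categories. All data below are random placeholders;
# a real study would use features extracted from labelled recordings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 64))     # placeholder: 64 features per recorded call
y = rng.integers(0, 3, size=600)   # placeholder labels: 0=mating, 1=foraging, 2=alarm

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# With random placeholders, accuracy hovers near chance (~33%); with real
# labelled calls, this score shows whether the patterns the network found
# actually track behaviour.
print("held-out accuracy:", model.score(X_test, y_test))
```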
We have also learned that Google's bioacoustics AI model (2024) can distinguish between the vocalisations of multiple whale species, telling us more than just who is communicating. Researchers have also discovered that sperm whales may use combinatorial coding, combining different clicks and pauses to create more complex phrases.
Translating whale song into something humans can understand also aims to help track whales' movements, protecting them from ship strikes and reducing the sonar that interferes with their echolocation.
The knowledge emerging from this extensive research may give us valuable insights into whale communication, which could completely change how we understand not just marine life, but the very nature of language and intelligence itself.
We will be able to monitor whale populations, gauge ocean health, decode complex behaviours, and protect some of the largest and most majestic mammals on this planet, taking us one step closer to a world of innovation, creativity, connection, and harmony.
And to think, it can all be done with the same technology you use to solve your homework.