AI streamlines acoustic ID of beluga whales

Scientists at the National Oceanic and Atmospheric Administration who study endangered beluga whales in Alaska’s Cook Inlet used artificial intelligence to reduce the time they spend on analysis by 93%.

Researchers have acoustically monitored beluga whales in the waterway since 2008, but “acoustic data analysis is labor-intensive because automated detection tools are relatively archaic in our field,” Manuel Castellote, a NOAA affiliate scientist, told GCN. “By improving the analysis process, we would provide results sooner, and our research would become more efficient.”

The analysis typically bogs down at the validation stage because detectors pick up any acoustic signal that resembles a beluga whale’s call or whistle. As a result, researchers get many false detections, including noise from vessel propellers, ice friction and even birds at the surface in shallow areas, Castellote said.

A machine learning model that could distinguish between actual whale calls and other sounds would provide highly accurate validation output and “replace the effort of a human analyst going through thousands of detections to validate the ones corresponding to beluga,” he said.

The researchers used Microsoft AI products to develop a detector built from four deep learning models: a deep neural network, a convolutional neural network, a deep residual network and a densely connected convolutional network. The resulting ensemble of the four models is more accurate than any one of them on its own, Castellote said.
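The article doesn’t describe how the four models’ outputs are combined, but a common approach for an ensemble classifier like this is soft voting: averaging each model’s probability score per detection. This is an illustrative Python sketch, not NOAA’s code, and the scores below are made up:

```python
import numpy as np

def ensemble_validate(probabilities: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Soft-voting ensemble: average each model's beluga-call probability
    per detection and keep detections whose mean score clears the threshold.

    probabilities has shape (n_models, n_detections).
    """
    return probabilities.mean(axis=0) >= threshold

# Made-up scores from four models for three detections: a clear beluga
# call, an ambiguous signal and vessel propeller noise.
scores = np.array([
    [0.95, 0.55, 0.10],  # deep neural network
    [0.90, 0.35, 0.05],  # convolutional neural network
    [0.97, 0.50, 0.20],  # deep residual network
    [0.92, 0.40, 0.15],  # densely connected convolutional network
])
print(ensemble_validate(scores).tolist())  # → [True, False, False]
```

Averaging tends to beat any single model because the models make partly uncorrelated mistakes, which is consistent with Castellote’s observation that the ensemble outperforms each independent model.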

Here’s how it works: Twice a year, researchers recover acoustic recorders from the seafloor. A semi-automated detector then processes the recordings, searching for tonal signals. It yields thousands, sometimes hundreds of thousands, of detections per dataset.
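A crude stand-in for that tone-searching step, assuming a simple spectral-peak test (the frame length, frequency band and threshold here are illustrative, not the detector’s actual parameters):

```python
import numpy as np

def detect_tones(signal, rate, frame_len=1024, band=(500.0, 5000.0), snr_db=20.0):
    """Flag frames whose strongest in-band spectral peak stands far above
    the frame's median spectral level, a crude test for tonal sounds."""
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / rate)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    window = np.hanning(frame_len)
    hits = []
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        spectrum = np.abs(np.fft.rfft(signal[start:start + frame_len] * window))
        peak = spectrum[in_band].max()
        floor = np.median(spectrum) + 1e-12  # avoid division by zero
        if 20 * np.log10(peak / floor) > snr_db:
            hits.append(start / rate)  # detection time in seconds
    return hits
```

A test like this fires on any strong narrowband signal, which is exactly why the log fills with false positives from propellers, ice and birds: tonality alone cannot tell a beluga from other tonal noise.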

To train the models, the team used the collection of recordings with annotated detections, both actual beluga calls and false positives, that it has amassed over the past 12 years.

“Now, instead of having a data analyst sit in front of a computer for seven to 14 days to validate all these detections one by one, the unvalidated detection log is used by the ensemble model to check the recordings and validate all the detections in the log in four to five hours,” Castellote said. “The validated log is then used to generate plots of beluga seasonal presence in each monitored location. These results are useful to inform management decisions.”
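The validation pass Castellote describes can be sketched as a simple loop: pull each logged detection’s clip from the recording, score it with the model, and keep only the clips classified as beluga. This is a minimal illustration with a toy stand-in for the ensemble, not the team’s pipeline:

```python
import numpy as np

def validate_log(detections, recording, rate, score_clip, threshold=0.5):
    """Automate the manual review: score every logged detection and keep
    only those classified as beluga.

    detections: (start_s, end_s) pairs from the unvalidated detection log.
    score_clip: callable returning a beluga probability for an audio clip;
                it stands in here for the ensemble model.
    """
    validated = []
    for start_s, end_s in detections:
        clip = recording[int(start_s * rate):int(end_s * rate)]
        if score_clip(clip) >= threshold:
            validated.append((start_s, end_s))
    return validated

# Toy example: a 4-second recording with one loud second, and a "model"
# that scores clips by mean energy.
rate = 1000
recording = np.zeros(4 * rate)
recording[rate:2 * rate] = 1.0
log = [(0.0, 1.0), (1.0, 2.0), (2.0, 3.0)]
score = lambda clip: float(np.mean(clip ** 2))
print(validate_log(log, recording, rate, score))  # → [(1.0, 2.0)]
```

The surviving entries would then feed the seasonal-presence plots Castellote mentions.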

With the significant time they’re saving, researchers can increase the number of recorders they send to the seafloor each season and focus on other aspects of data analysis, such as understanding where belugas feed based on the sounds they make when hunting prey, Castellote said. They can also study human-made noise to identify activity in the area that might harm the whales.

The team is now moving into the second phase of its collaboration with Microsoft, which involves cutting the semi-automated detector out of the process and instead applying ML directly to the sound recordings. The streamlined process will search for signals from raw data, rather than using a detection log to validate pre-detected signals.
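In outline, that phase-two approach replaces the detection log with a sliding window over the raw audio, classifying every segment directly. A sketch under assumed parameters (window size, hop and label set are all illustrative):

```python
import numpy as np

def scan_raw_audio(signal, rate, classify, window_s=1.0, hop_s=0.5):
    """Slide a window across raw audio and let a multi-class model label
    each segment directly, with no pre-detection step.

    classify: callable mapping a window of samples to a label such as
              'beluga', 'vessel_noise' or 'none' (this label set is
              illustrative, not NOAA's).
    """
    win, hop = int(window_s * rate), int(hop_s * rate)
    events = []
    for start in range(0, len(signal) - win + 1, hop):
        label = classify(signal[start:start + win])
        if label != 'none':
            events.append((start / rate, label))
    return events

# Toy classifier: call any loud window a beluga.
rate = 100
signal = np.zeros(3 * rate)
signal[rate:2 * rate] = 1.0
classify = lambda w: 'beluga' if np.abs(w).max() > 0.5 else 'none'
print(scan_raw_audio(signal, rate, classify))
```

Because the classifier sees every window rather than only pre-detected tones, the same scan can return labels for other cetaceans or human-made noise, which is the widening Castellote describes.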

“This allows widening the detection process from beluga only to all cetaceans inhabiting Cook Inlet,” Castellote said. “Furthermore, it allows incorporating other target signals to be detected and classified … [such as] human-made noise. … Once the detection and classification processes are implemented, this approach will allow covering multiple objectives at once in our data analysis.”

Castellote’s colleague, Erin Moreland, will use AI this spring to monitor other mammals, too, including ice seals and polar bears. A NOAA turboprop airplane outfitted with AI-enabled cameras will fly over the Beaufort Sea scanning and classifying the imagery to produce a population count “that will be ready in hours instead of months,” according to a Microsoft blog post.

The work is in line with a larger NOAA push for more AI in research. On Feb. 18, the agency finalized the NOAA Artificial Intelligence Strategy. It lists five goals for using AI, including establishing organizational structures and processes to advance AI agencywide, using AI research in support of NOAA’s mission and accelerating the transition of AI research to applications.

Castellote said the ensemble deep learning model he’s using could easily be applied to other acoustic signal research.

“A code module was built to allow retraining the ensemble,” he said. “Thus, any other project focused on different species (and soon human-made noise) can adapt the machine learning model to detect and classify signals of interest in their data.”

Specifics about the model are available on GitHub.