This article is part of our latest Artificial Intelligence special report, which focuses on how the technology continues to evolve and affect our lives.

If you had about 180,000 hours of underwater recordings from the Pacific Ocean, and you needed to know when and where, across all those hours, humpback whales were singing, would you Google it?

That is what Ann Allen, a research ecologist at the National Oceanic and Atmospheric Administration, did. Sort of.

In January 2018, she approached Google and asked whether its engineers might be able to help her find the signal of humpback whale songs amid all the other ocean noise, like dolphin calls or ship engines. Using 10 hours of annotated data, in which the whale songs and other noises were identified, Google engineers trained a neural network to detect the songs, based on a model for recognizing sounds in YouTube videos, said Julie Cattiau, a product manager at Google.
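The general recipe described here, turning audio into spectrogram features and training a classifier on a small annotated set, can be sketched in miniature. Everything below is illustrative and is not NOAA's data or Google's actual model: the sample rate, the synthetic "song" clips (a low-frequency tone standing in for a whale song unit), and the one-layer logistic model standing in for the neural network are all assumptions chosen to keep the sketch self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 4000  # hypothetical sample rate in Hz, for illustration only

def clip(tonal: bool, n=SR):
    """One second of synthetic audio: a low-frequency tone (a stand-in
    for a whale song unit) buried in noise, or noise alone."""
    t = np.arange(n) / SR
    noise = rng.normal(0, 1, n)
    if tonal:
        f = rng.uniform(200, 400)  # low frequencies, as in humpback song
        return 3 * np.sin(2 * np.pi * f * t) + noise
    return noise

def features(x, win=256):
    """Mean log-power per frequency bin of a simple spectrogram
    built from non-overlapping windows."""
    frames = x[: len(x) // win * win].reshape(-1, win)
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    return np.log1p(power).mean(axis=0)

# A tiny labelled set, standing in for the annotated recordings.
X = np.array([features(clip(i % 2 == 0)) for i in range(200)])
y = np.array([1.0 if i % 2 == 0 else 0.0 for i in range(200)])
X = (X - X.mean(0)) / (X.std(0) + 1e-9)  # normalize each feature

# A one-layer model (logistic regression) trained by gradient descent,
# a drastically simplified stand-in for the neural network in the story.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(300):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted song probability
    g = p - y                             # gradient of the logistic loss
    w -= 0.1 * (X.T @ g) / len(y)
    b -= 0.1 * g.mean()

acc = ((p > 0.5) == (y == 1)).mean()
print(f"training accuracy: {acc:.2f}")
```

The point of the sketch is the shape of the task, not the model: a small amount of labelled audio is enough to fit a detector because the song concentrates energy in a few frequency bins that background noise does not. A production system like the one described would use a far larger network and real spectrograms, and would have to contend with confounds such as dolphin calls and engine noise.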