
By creating tech that lets dolphins play computer games and request belly rubs, we can understand their intelligence and perhaps even get a preview of life on other planets, says marine mammal researcher Diana Reiss.

In some ways, dolphins and humans aren’t so different. We’re both intelligent, social animals, and we both rely on complex vocal signals to convey information. However, while we have recognized for decades that dolphins possess their own distinct language, we still don’t understand what they are saying. Now Hunter College cognitive psychologist and marine mammal scientist Diana Reiss (TED Talk: The interspecies Internet?) is collaborating with Marcelo Magnasco at Rockefeller University and other researchers to develop a giant touchscreen that could start to decode their communication.

Dolphins communicate with sounds — a vast array of them. Besides producing clicks for echolocation, dolphins also make pulsing sounds, squawks, brays, pops and low grunts (called “thunks” by scientists), as well as a range of complicated whistles. Each dolphin has a contact call, a unique signature whistle it uses to identify itself, and “there are a rich variety of other calls,” says Reiss, which remain undecoded by humans. But here’s one problem in studying the aquatic animals: When recording them, it’s extremely difficult to track which dolphins are making which sounds — which makes it hard to match a call to an activity and understand what it means. So far, according to Reiss, scientists have only been able to identify the general kinds of sounds that dolphins make when engaged in a certain activity, like foraging or play. “It would be like saying, ‘these are the kinds of sounds humans make when they’re socializing,’” she says. “It hasn’t advanced us that much.”

In the 1980s, Reiss got a tantalizing glimpse into how technology could bridge the communication gap. Just as the PC was becoming more available, she and her colleagues built a dolphin-friendly version of sorts: a 9-by-9-foot electronic keyboard that could be lowered into a pool. Each key bore a unique symbol. When dolphins pressed a given key with their beaks, the signal traveled via fiber-optic cable to an Apple II computer, which was programmed to generate a dolphinesque whistle. These sounds were in the same frequency range as dolphin whistles but were unique, so that the dolphins could use the keyboard to make requests of their human handlers: for toys, such as a ball or ring, or for hands-on attention, like a belly rub. When Reiss listened to recordings of the pool made via underwater microphones, she heard the dolphins mimicking the Apple II’s whistles on their own, and sometimes even combining them with their own contact calls. Reiss was thrilled and curious — could the keyboard serve as a Rosetta Stone for understanding how dolphins learn new sounds and use them? However, the keyboard’s functionality was limited, and the camera and microphone technology that could match recorded sounds to individual dolphins didn’t exist.
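At its core, the keyboard was a lookup from a visual symbol to a synthesized whistle and a request for the handlers. A minimal sketch of that mapping in Python — the symbols, frequency contours and rewards here are invented for illustration, not taken from the original Apple II program:

```python
# Hypothetical keyboard layout: each symbol maps to a distinctive whistle
# contour (frequencies within the dolphins' hearing range) and the request
# it conveys to the human handlers. Illustrative values only.
KEYBOARD = {
    "circle":   {"whistle_hz": [8000, 10000, 12000], "request": "ball"},
    "triangle": {"whistle_hz": [12000, 9000, 6000],  "request": "ring"},
    "cross":    {"whistle_hz": [7000, 7000, 11000],  "request": "belly rub"},
}

def press(symbol):
    """Simulate a key press: return the whistle contour to synthesize
    and the request to relay to the handlers."""
    entry = KEYBOARD[symbol]
    return entry["whistle_hz"], entry["request"]

whistle, request = press("circle")
```

Because each key played a unique, artificial whistle, any mimicry the underwater microphones picked up could be traced back to a specific symbol — which is what made the keyboard a candidate Rosetta Stone.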

Now, Reiss and a group of biophysicists have brought this idea into the 21st century with a dolphin touchscreen. The team has built a 4-by-8-foot window into the wall of a pool at the National Aquarium in Baltimore. Because immersing an actual touchscreen in water would be dangerous, Rockefeller postdoctoral researcher Ana Hočevar Brezavšček developed a smart solution: A projector on the researchers’ side casts interactive programs onto a screen that dolphins can see, and optical sensing technology detects when a dolphin touches the screen.

Could dolphins use it, too? To find out, the team created a dolphin-friendly version of whack-a-mole, in which virtual fish swim across the screen and vanish when they’re touched. They tried it out with a 10-year-old adolescent dolphin named Foster. Within seconds of the screen turning on, Foster “came up to it, stopped on a dime and started touching the fish,” Reiss reports. When the fish disappeared at a touch of his melon, or forehead, the game was “self-reinforcing,” she says. “He just got it.” In another session, he began tapping the virtual fish with his beak. His behavior was surprising, given that Foster — born and raised at the aquarium, where he is fed dead, defrosted fish — has never seen a live fish before. Reiss says it’s too early to speculate why he grasped the game so immediately.

Now Reiss is reviving her 1980s experiment. Heartened by these successful sessions with Foster, the team has paused their experiments so they can develop an app, similar to the keyboard that Reiss used decades earlier, in which whistles will be associated with symbols and objects. But this time, the team also has a complex array of microphones and cameras to deploy. Installing this system around such intelligent creatures has had its challenges. “Dolphins are curious, and they will try to break things,” says Rockefeller biophysics PhD student Sean Woodward. He designed a system of four acrylic panels, each with four microphones embedded in it, to be placed along the walls of the dolphin pool to track sound in different locations. So far, he says, the dolphins have been rubbing themselves on the panels to scratch their itches, but the equipment is still intact. He is also installing 11 cameras to track the locations of the dolphins (multiple cameras are needed to correct for refraction in the water). By combining the audio and visual data, the team will be able to trace each sound back to a particular point in the pool and to a specific dolphin, creating a running transcript of exactly who is saying what and when. Data-mining algorithms will help sift through this information and look for patterns. It’s possible we’ll soon start to understand how the dolphins learn symbols and sounds and how their sounds correspond to particular behaviors. With such fine-grained data, Reiss is hopeful they’ll make significant progress in understanding dolphin communication.
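The acoustic half of this setup rests on a standard idea: each microphone hears the same whistle at a slightly different time, and those time differences of arrival (TDOAs) pin down where in the pool the sound was made. A minimal Python sketch of the principle, using a brute-force grid search — the pool dimensions, microphone positions and speed of sound are assumptions for illustration, not the team’s actual configuration or code:

```python
import math

SPEED_OF_SOUND_WATER = 1482.0  # m/s, a rough textbook value (assumption)

# Hypothetical microphone positions (x, y) in metres along the pool walls.
MICS = [(0.0, 0.0), (10.0, 0.0), (10.0, 6.0), (0.0, 6.0)]

def tdoas(x, y):
    """Predicted arrival-time differences, relative to the first
    microphone, for a sound made at point (x, y) in the pool."""
    times = [math.hypot(x - mx, y - my) / SPEED_OF_SOUND_WATER
             for mx, my in MICS]
    return [t - times[0] for t in times[1:]]

def locate(measured, step=0.05):
    """Brute-force grid search over a 10 m x 6 m pool: return the point
    whose predicted TDOAs best match the measured ones (least squares)."""
    best, best_err = None, float("inf")
    for i in range(int(10.0 / step) + 1):
        for j in range(int(6.0 / step) + 1):
            x, y = i * step, j * step
            err = sum((p - m) ** 2 for p, m in zip(tdoas(x, y), measured))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Simulate a whistle at a known spot, then recover it from the delays alone.
x_est, y_est = locate(tdoas(3.4, 2.1))
```

Real systems solve this in three dimensions with closed-form or iterative multilateration rather than a grid search, but the input is the same: microphone positions plus arrival-time differences. Matching that acoustic position against the camera-derived positions of the dolphins is what would let the team attribute each call to an individual.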

The key, Reiss says, is not to train the dolphins, but rather to give them “choice and control” in how they learn. “Dolphins are really our partners on this,” she says. Observing their reactions, on their own terms, to new stimuli via the touchscreen’s apps should illuminate the nature of their intelligence. And because dolphins are both highly intelligent and extremely different from us, figuring out how their minds work could provide us with important insights into how intelligence evolves and manifests given different environments, inputs and biologies. We’ll be able to see more clearly, Reiss says, “what’s unique in us and what we share with other animals.”

Understanding their communication could even help us as we explore other planets. Like potential extraterrestrial life forms, dolphins are intelligent non-humans who evolved to thrive in a vastly different environment from us. “Decoding dolphin language could be a practice ground for aliens,” says Reiss.