PREVIOUS DOLPHIN LANGUAGE RESEARCH

Phoenix and Akeakamai

From the 1970s through the mid-2000s, a group of researchers at the former Kewalo Basin Marine Mammal Laboratory in Honolulu, Hawaii, led by Dr. Louis Herman, conducted experiments using two different artificial languages to test bottlenose dolphins' ability to understand syntax.







One of the languages was based on artificially generated whistle sounds and was taught to a dolphin named Phoenix in the early 1980s. In principle, this language allowed the dolphin to "talk back", but it was abandoned after only a few years. The other language was based on gestural signs (and was therefore a one-way communication scheme) and was taught to another dolphin, Akeakamai (Hawaiian for "lover of wisdom"). The two languages used different word orders to investigate how this would affect the dolphins' ability to acquire them. The whistle language taught to Phoenix used a left-to-right word order, while the gestural language used the inverse order. Both languages included recombinable words representing agents, objects, object modifiers, actions, and conjunctions, which were combined into sentences of two to five words.



That the dolphins understood word order was demonstrated by testing them with so-called semantically reversible sentences. These are sentences in which the subject and object cannot be deduced from meaning alone, but only through syntactic knowledge. For example, in a sentence such as "The cat chased the mouse", word knowledge alone helps us infer who is likely to chase whom, so it would not qualify as a semantically reversible sentence. The dolphins were presented with, and acted correctly on, sentences such as "Surfboard person fetch" (take the person to the surfboard) as well as "Person surfboard fetch" (take the surfboard to the person).







The dolphins also responded correctly the first time they were exposed to new sentences, such as "Pipe left frisbee fetch", on the basis of their previous understanding of the words and of the words' relationships within a command structure. Akeakamai, the dolphin taught the sign language, also responded correctly to commands presented on a TV monitor viewed through an underwater window, succeeding on her very first exposure to television. Even when the monitor showed only white-gloved hand signs against black clothing (with the person's body and arms invisible), she still responded correctly right away.







Gestural signs for abstract concepts – such as left, right, absent (the concept of zero or "nothing"), creative ("come up with something new"), and tandem ("do this together in synchrony") – were also understood by these dolphins. In fact, a combination of two of these signs, "tandem creative", retains its mystery to this day: every time it was given, the two dolphins responded by coming up with a completely new behavior (such as doing a back-flip somersault in the air and spitting out water at the highest point) and performing it in synchrony with each other. This is at least suggestive of the possibility that the dolphins were able to communicate complex information to each other and reach consensus in a short time, just prior to performing the new behavior.