Macaque monkeys can be trained to produce complex spatial sequences beyond the simplest levels of grammar previously known from animal studies. This indicates cognitive capabilities in the spatial-motor domain that approach the computational complexity level of human syntax.

Main Text

The human capacity for language, allowing us to express any thought we can think, appears to be unique on the planet: although most animals communicate, none but humans show this unbounded expressive power. But our capacity to use and acquire language consists of multiple interlocking subcomponents, many of which are shared with other species [1]. For example, humans are the only living primate species known to be capable of learning and reproducing novel vocalizations, including words or melodies; but we share this capability with other, more distantly related species, including birds, bats, seals, and elephants [2–4]. Shared traits are a boon to biologists interested in language, as animal models allow us to deploy a panoply of neuroscientific tools to understand their inner workings, and to test evolutionary hypotheses about adaptive function.

One component of language has until now resisted the search for parallels in our animal brethren: syntax. All human languages have at their core complex sets of rules which enable us to combine phonemes into syllables, syllables into words, and words into sentences with precise, specific meanings. The last, most complex stage (words into sentences) requires rule systems — ‘grammars’ — previously thought to be beyond the capabilities of nonhuman animals [5,6]. In this issue of Current Biology, Jiang et al. [7] show that, with adequate training, monkeys can break beyond this barrier.

Figure 1. Macaques can master supra-regular grammars [7]. (A) The formal language hierarchy categorizes computational systems at different levels of complexity. Each small circle is a special case of the outer, enclosing circles; the outermost circle of all represents anything that is Turing-machine computable. Until now, high performance on supra-regular systems was known only for humans. (B) A rhesus macaque working on a touchscreen illustrates the basic paradigm developed by Jiang et al., where a circular array of screen locations is used to encode grammars at varying levels of complexity. The monkeys in this study succeeded in mastering two complex grammars at the supra-regular level, a first for nonhuman animals.

Working in Liping Wang’s laboratory, together with Stanislas Dehaene, Jiang and colleagues [7] trained two rhesus macaques to produce structured sequences by pressing a touchscreen at specific locations arrayed around a circle. After intensive training, the monkeys could learn rules more complex than any previously demonstrated in nonhuman species. Most tellingly, they learned to produce mirror sequences following the pattern ABC|CBA, and generalized this ability to new sequence lengths. This is important because such ‘mirror grammars’ require computational capabilities beyond the simplest type, the so-called ‘regular’ or ‘finite-state’ grammars. Like the grammars of all human languages, mirror grammars require a learner to possess ‘supra-regular’ computational abilities, which require specific computational machinery not needed at the lower, sub-regular level (Figure 1). The new results thus suggest that the monkey brain possesses the kind of cognitive mechanisms required for human linguistic syntax, at least in this specific cognitive domain, and after intensive training.

To fully unpack the significance of this result requires a bit of computational theory. Formal language theory is a branch of mathematics, originating with the work of Alan Turing [8], that plays a central role in theoretical computer science [9]. It specifies the types of computational mechanisms required to cope with potentially infinite sets of strings — termed ‘languages’ — that obey certain constraints or follow certain patterns [10]. Note that, despite using such words as ‘grammar’ and ‘language’, this body of theory is not limited to human languages: it applies across diverse domains including mathematical expressions, music, visual patterns, or even well-formed telephone numbers. Any system where some strings are valid (‘grammatical’ or well-formed) and others invalid can be analysed using formal language theory, which tells us what type of computational system is needed to identify or generate that particular set of well-formed strings.
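As a concrete toy illustration (mine, not from the paper): even something as mundane as a set of well-formed telephone numbers is a formal ‘language’, and a recognizer for it answers the grammaticality question for any candidate string. The number format below is an arbitrary example.

```python
import re

# Toy 'language': strings of the form ddd-dddd (an arbitrary, finite --
# and therefore regular -- set). The regular expression compactly
# specifies a finite-state recognizer for exactly this set of strings.
PHONE = re.compile(r"\d{3}-\d{4}")

def grammatical(s: str) -> bool:
    """Return True if s is a well-formed string of the toy language."""
    return PHONE.fullmatch(s) is not None

print(grammatical("555-0123"))  # True: well-formed
print(grammatical("55-01234"))  # False: violates the pattern
```

The recognizer never needs to remember more than a bounded amount of context, which is what places this language at the lowest (regular) level of the hierarchy.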

The most limited class consists of the regular languages, which require computational machinery termed ‘finite-state automata’. Such systems have a limited ‘rote’ memory, but little ability to keep track of past occurrences or context. Abundant previous work shows that many nonhuman species possess at least this level of computational power. But human language has many examples where such limited systems are inadequate: for example, a sentence with an ‘if’ will typically have, some arbitrary number of words later, the word ‘then’, and such if/then pairs can be nested within one another. This is one of many examples in linguistic syntax where complex tree structures are required to capture the empirical facts about human language, and, crucially, supra-regular systems are needed to capture such patterns [9].
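To make the limitation concrete, here is a minimal sketch (my illustration, not from any cited study) of a finite-state automaton for the regular rule ‘repeat AB indefinitely’, i.e. the language (AB)+. Its entire memory is the current state, a single symbol — exactly the kind of bounded memory that cannot track arbitrarily deep if/then nesting.

```python
# Finite-state automaton for (AB)+ ('repeat AB indefinitely').
# The transition table is the machine's complete description; there is
# no stack, counter, or other unbounded memory.
TRANSITIONS = {
    ("start", "A"): "sawA",
    ("sawA", "B"): "accept",
    ("accept", "A"): "sawA",
}

def fsa_accepts(string: str) -> bool:
    state = "start"
    for symbol in string:
        state = TRANSITIONS.get((state, symbol))
        if state is None:          # no legal transition: reject
            return False
    return state == "accept"

print(fsa_accepts("ABAB"))   # True
print(fsa_accepts("AABB"))   # False: not a repetition of AB
```

Nested dependencies (if...if...then...then) would require the machine to remember how many openings are pending — an unbounded count, beyond any fixed set of states.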

Previous animal work on supra-regularity started with the finding of Fitch and Hauser [11] that cotton-top tamarins, while able to learn a simple regular rule (‘repeat AB indefinitely’), were unable to learn a closely matched supra-regular grammar termed AⁿBⁿ (meaning ‘some number of As followed by the same number of Bs’). This suggested that supra-regularity may represent a threshold between humans and other species. However, it used a habituation paradigm involving very little training. Later work using intensive training in songbirds appeared to show success on this same grammar [12] but, along with several subsequent studies [7], has been faulted on methodological grounds. The central empirical challenge is to exclude the possibility that animals ‘succeed’ on a supra-regular task by inferring ‘shortcuts’ or heuristics at the finite-state level; animals have indeed been demonstrated in several studies to adopt such alternative strategies [13,14]. Although supra-regular, the AⁿBⁿ grammar underlying most of this previous work has also been faulted for allowing strategies such as counting, which have little relevance for the structural analyses at the heart of human languages.
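The counting loophole is easy to make concrete. The sketch below (my illustration, not from the cited studies) recognizes AⁿBⁿ with nothing more than a single counter — no tree structure, and hence no evidence of syntax-like processing, is required.

```python
# A^nB^n recognized with a single counter -- the kind of 'shortcut'
# strategy that makes this grammar a weak test of tree-structured
# (supra-regular) abilities, even though the language itself is
# formally supra-regular.
def anbn_by_counting(string: str) -> bool:
    count = 0
    for i, symbol in enumerate(string):
        if symbol == "A":
            if count != i:       # an A appearing after some B: reject
                return False
            count += 1
        elif symbol == "B":
            count -= 1
            if count < 0:        # more Bs than As so far: reject
                return False
        else:
            return False
    return count == 0 and len(string) > 0

print(anbn_by_counting("AAABBB"))  # True
print(anbn_by_counting("AABBB"))   # False: counts do not match
```

An animal that merely tallies As against Bs would pass this test without representing anything like a nested structure.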

The mirror grammar used by Jiang et al. [7] neatly avoids this problem, as simple counting or various finite-state shortcuts will not yield success. This grammar, combined with the novel experimental paradigm, makes the new work a paragon of how to perform this type of animal research. In addition to the mirror grammar, the monkeys in this study learned two additional grammars, including a ‘repeat’ or ‘copy’ grammar (requiring supra-regular resources similar to the mirror grammar). One monkey was able, impressively, to combine a spatial rule (‘progress around the circle’) with the mirror repetition, so that given ‘A’ it could produce the subsequent ‘BC’, followed by the entire mirrored sequence ‘CBA’.
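Why counting fails here can be seen in a minimal sketch (my illustration, under the assumption of arbitrary item identities): recognizing a sequence of the form w|mirror(w) requires remembering the entire first half in order, which is naturally done with a stack — precisely the supra-regular memory that a finite-state machine or a bare counter lacks.

```python
# Mirror grammar w|reverse(w), e.g. ABC|CBA. Matching the second half
# requires recalling the first half in reverse order -- stack-like,
# supra-regular memory, not just a tally of how many items occurred.
def is_mirror(sequence: list) -> bool:
    if len(sequence) == 0 or len(sequence) % 2 != 0:
        return False
    half = len(sequence) // 2
    stack = []
    for item in sequence[:half]:
        stack.append(item)            # push the first half
    for item in sequence[half:]:
        if stack.pop() != item:       # pops must mirror the second half
            return False
    return True

print(is_mirror(list("ABCCBA")))  # True: mirror pattern
print(is_mirror(list("ABCABC")))  # False: that is the 'copy' pattern
```

(The ‘copy’ grammar ABC|ABC would instead need queue-like, first-in-first-out memory; both demands exceed finite-state power.)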

Finally, to put these achievements into perspective, the researchers tested pre-schoolers aged 5–6 years. Human children learned the tasks easily and almost instantly (requiring about five demonstrations), and vastly out-performed the monkeys.

This monkey/pre-schooler comparison suggests two rather different interpretations of the overall results. On the one hand, the new results are bad news for those who want to draw a strict line separating humans from other animals. On the other hand, the fact that monkeys require intensive training (tens of thousands of trials) to learn the task successfully, while children master it nearly perfectly with almost no training, suggests a major quantitative distinction between humans and macaques in this cognitive domain. While humans may not be the only species capable of mastering supra-regular systems, we may have an unusually strong propensity to do so [15]. I have previously dubbed this human propensity to infer tree structures ‘dendrophilia’ [16]; the new work suggests that macaques may be dendro-competent with training, but not dendrophilic by nature.

Future research can proceed along two major pathways. The first concerns the generality of the macaques’ capabilities: are they limited to this particular visuospatial domain and manual output modality, or can macaques generalize to other systems (such as auditory sequences)? If supra-regularity turns out to be domain-specific, it may be that spatial cognition and/or motor control provided the original evolutionary function of supra-regularity, later exploited and strengthened during human evolution. Second, now that a task exists in which monkeys successfully exhibit supra-regular abilities, neuroscientists can begin a full exploration of the neural mechanisms involved (for example, with intracranial recording, functional magnetic resonance imaging, and so on). Such research should greatly illuminate the brain mechanisms underlying human language. My money, personally, is on involvement of the region of prefrontal cortex known as Broca’s area, which exists in monkeys but is greatly expanded in humans [17,18]. It is possible that the massive expansion of this region in humans, combined with greatly increased connectivity, underlies the domain-general dendrophilia typifying our species [19].

In conclusion, after more than a decade of searching, a paradigm allowing a nonhuman species to break through the ‘syntax barrier’ has finally been found. While many other mechanisms are needed for full human language (including both complex semantics and the neural control required for speech), one central component of syntax can now be explored at the neural level. Be sure to watch this space!

References

1. Fitch, W.T. The Evolution of Language.
2. Stoeger, A.S., Mietchen, D., Oh, S., de Silva, S., Herbst, C.T., Kwon, S., and Fitch, W.T. An Asian elephant imitates human speech.
3. Knörnschild, M. Vocal production learning in bats.
4. Nowicki, S., and Searcy, W.A. The evolution of vocal learning.
5. Fitch, W.T., and Friederici, A.D. Artificial grammar learning meets formal language theory: an overview.
6. ten Cate, C. Assessing the uniqueness of language: animal grammatical abilities take center stage.
7. Jiang, X., Long, T., Cao, W., Li, J., Dehaene, S., and Wang, L. Production of supra-regular spatial sequences by macaque monkeys.
8. Turing, A.M. On computable numbers, with an application to the Entscheidungsproblem.
9. Jäger, G., and Rogers, J. Formal language theory: refining the Chomsky hierarchy.
10. Hopcroft, J.E., Motwani, R., and Ullman, J.D. Introduction to Automata Theory, Languages and Computation.
11. Fitch, W.T., and Hauser, M.D. Computational constraints on syntactic processing in a nonhuman primate.
12. Gentner, T.Q., Fenn, K.M., Margoliash, D., and Nusbaum, H.C. Recursive syntactic pattern learning by songbirds.
13. Ravignani, A., Westphal-Fitch, G., Aust, U., Schlumpp, M., and Fitch, W.T. More than one way to see it: individual heuristics in avian visual cognition.
14. van Heijningen, C.A.A., de Vissera, J., Zuidema, W., and ten Cate, C. Simple rules can explain discrimination of putative recursive syntactic structures by a songbird species.
15. Miller, G.A. Project Grammarama.
16. Fitch, W.T. Toward a computational framework for cognitive biology: unifying approaches from cognitive neuroscience and comparative cognition.
17. Rilling, J.K., Glasser, M.F., Preuss, T.M., Ma, X., Zhao, T., Hu, X., and Behrens, T.E.J. The evolution of the arcuate fasciculus revealed with comparative DTI.
18. Schenker, N.M., Hopkins, W.D., Spocter, M.A., Garrison, A.R., Stimpson, C.D., Erwin, J.M., Hof, P.R., and Sherwood, C.C. Broca's area homologue in chimpanzees (Pan troglodytes): probabilistic mapping, asymmetry and comparison to humans.
19. Fitch, W.T. What animals can teach us about human language: the phonological continuity hypothesis.