The basic problem of “efficiency,” in linguistics, starts with the trade-off between effort and communication. It takes a certain amount of coordination, and burns a certain number of calories, to make noises come out of your mouth in an intelligible way. And those noises can be more or less informative to a listener, based on how predictable they are. If you and I are discussing dinosaurs, you wouldn’t be surprised to hear me rattle off the names of my favorite species. But if a stranger walks up to you on the street and announces, “Diplodocus!” it’s unexpected. It narrows the scope of possible conversation topics greatly and is therefore highly informative.

Informativity in linguistics is usually calculated per syllable, and it’s measured in bits, just like computer files. The concept can be rather slippery when you’re talking about talking, but essentially, a bit of linguistic information is the amount of information that reduces uncertainty by half. In other words, if I utter a syllable, and that utterance narrows down the set of things I could be talking about from everything in the world to only half the things in the world, that syllable carries one bit of information.
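The halving idea above is just the standard information-theoretic notion of surprisal. As a minimal sketch (the function name and probabilities here are illustrative, not from the study), a syllable's information content in bits is the negative base-2 logarithm of its probability:

```python
import math

def surprisal_bits(probability: float) -> float:
    """Information content of an event, in bits: -log2(p).
    An event with probability 1/2 carries exactly one bit."""
    return -math.log2(probability)

# A syllable that narrows the possibilities by half carries 1 bit.
print(surprisal_bits(0.5))    # 1.0
# A rarer, more surprising syllable carries more bits.
print(surprisal_bits(0.125))  # 3.0
```

This is why predictability matters: a syllable everyone expects has probability near 1 and carries close to zero bits, while an out-of-the-blue "Diplodocus!" carries many.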

In the new study, the authors calculated the average information density—that is, bits per syllable—of a set of 17 Eurasian languages and compared it with the average speech rate, in syllables per second, of 10 speakers of each language. They found that the rate of information transfer stayed roughly constant across languages, at about 39.15 bits per second.

François Pellegrino, the senior author of the new study, says linguists aren’t likely to be surprised to learn that there’s a trade-off between speech rate and information density: “It just confirms what the intuition would be.” But what’s special about his and his team’s work is that, for the first time, they were able “to prove that it holds” for this set of languages.

The speed-efficiency trade-off is likely to be fodder for a long-standing debate among linguists about what language is, and what it’s for. “One of the big divisions in the field of linguistics right now is whether it’s useful to think about language as a code for communication or whether it’s more useful to think about language as something like a mathematical language,” says Richard Futrell, an assistant language-science professor at UC Irvine. This controversy is often described as a fight over whether linguistic universals exist: The language-as-math camp, a.k.a. the generativists, follows in the footsteps of Noam Chomsky and holds that certain grammatical rules apply to all languages, while the language-as-communication camp, a.k.a. the functionalists, thinks that’s bunk.