Different arrangements of words can be likened to microstates in statistical mechanics—the total set of ways a system’s constituent particles can be configured. Photo illustration by Khomich Yauheni / Shutterstock

Like all new parents, I must sound like a kook when I babble along with my 9-month-old daughter. That’s okay: It delights her. I sometimes ask her what she might mean as she offers some apparently affirming utterance and looks at me with her big blue eyes: Oh, you like it when daddy lifts you? Ah, you’re thankful for a new diaper? Her vocalizations—the squeals and whoas and yah-wahs—can have surprising verve and a kind of ecological significance. My daughter’s noises, scientists say, “catalyze” me to produce “simplified, more easily learnable language.” 

My daughter is now regularly saying “mama.” My wife has taken this to mean the baby wants her attention. So I felt a little disheartened after reading a 2017 paper in Neuroscience & Biobehavioral Reviews, titled “The growth of language: Universal Grammar, experience, and principles of computation.” Charles Yang, a computer scientist at the University of Pennsylvania who studies language acquisition, wrote, along with his colleagues, that “Despite the prevalence of sounds such as ‘mama’ and ‘dada,’ the combination of consonants and vowels in babbling have no referential meaning”—they are nothing more than “rhythmic repetitions of nonsense syllables.” (Sorry, Danielle!) This lack of “semantic content” is nothing to be bummed about, though, for “babbling merges linguistic units (phonemes and syllables) to create combinatorial structures,” the bread and butter of language learning.

The puzzle of how language can be so learnable has had me, of late, utterly perplexed. Babies are helpless. I can see how, with some instruction, a child can pick up on a new idea or skill, like shoe-tying—they can understand what you’re saying and mimic movements. Yet my daughter will be obeying grammar reliably and effortlessly, if a bit crudely, before she’s tying her own shoes. It’s a wonder babies learn anything, let alone a language. Parrots, great speech mimickers that they are, can’t babble, or break speech down into its discrete units. Yang and his colleagues note that the “best that Alex, the famous parrot, could offer was a rendition of the word ‘spool’ as ‘swool,’ which he evidently picked up when another parrot was taught to label the object.” 

Parrots don’t, in other words, partake in the phenomenon of Merge, the engine of hierarchical linguistic structures, a mental technology that revs up when infants begin to vocalize. All human languages have an “inaudible and invisible hierarchical structure” that, when we’re children, we impose on sound sequences we hear, writes linguist David Adger in his Nautilus feature, “This Simple Structure Unites All Human Languages.” “We hear sounds or see signs, but our minds think syntax.” Which is why infants can learn any language. There’s something deeply hidden in the mind that recognizes syntax—or the peculiar grammar of each language—as if it were as apparent as a face. As Adger notes, “The full scale of linguistic complexity in a toddler’s grammar still eludes linguistic scientists and engineers alike, despite decades of intensive research.”

Perhaps it’s no surprise, then, that physicists are now chiming in. Take a recent paper from Eric DeGiuli, a theoretical physicist formerly at the École Normale Supérieure in Paris and now at Ryerson University in Toronto. In his paper, he argues that a language—or specifically a “context-free grammar,” which almost all languages share—can be treated as a physical object that children interact with as they hear words and sentences. In a context-free grammar, sentences have tree-like graphical structures: “The bear walked into the cave” can be broken down into a noun phrase and a verb phrase—“the bear” and “walked into the cave,” respectively—and each of those elements breaks down into more elementary units, like nouns and verbs, until you reach the words themselves. You might call the words the tree’s leaves.
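To make the tree picture concrete, here is a minimal sketch in Python using the NLTK toolkit. The toy grammar and the example parse are my own illustration, not anything from DeGiuli’s paper:

```python
import nltk

# A toy context-free grammar for the example sentence. Each rule rewrites
# a constituent into smaller pieces, bottoming out in the words themselves
# (the tree's leaves).
grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    NP -> Det N
    VP -> V PP
    PP -> P NP
    Det -> 'the'
    N  -> 'bear' | 'cave'
    V  -> 'walked'
    P  -> 'into'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the bear walked into the cave".split()):
    tree.pretty_print()  # draws the hierarchical structure as ASCII art
```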

This gave me the absurd image of a baby flying through a forest. The “surface” of a language that babies “touch” consists of all the possible configurations of words into sentences, both meaningful and meaningless. “Each grammar defines probabilities for sentences,” DeGiuli writes. In English, certain words, coming one after another, are more likely than others to make grammatical sense. DeGiuli says that word sequences that do make sense get assigned, in the baby’s mind, greater weight, while those that don’t get less. These different arrangements of words “are like the microstates in statistical mechanics—the set of all possible arrangements of a system’s constituent particles,” science writer Philip Ball wrote in an article on DeGiuli’s paper. “The question is, among all possible [context-free grammars], what kind of weight distributions distinguish [context-free grammars] that produce random-word sentences from those that produce information-rich ones?”
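One way to picture a grammar that “defines probabilities for sentences” is a probabilistic context-free grammar: each rewrite rule carries a probability, and the weight of a sentence is the product of the probabilities of the rules used to derive it. A sketch, again with NLTK, with the numbers invented purely for illustration:

```python
import nltk

# Bracketed numbers are rule probabilities; alternatives for the same
# left-hand side must sum to 1. All values here are made up.
pgrammar = nltk.PCFG.fromstring("""
    S  -> NP VP     [1.0]
    NP -> Det N     [1.0]
    VP -> V PP      [0.4]
    VP -> V NP      [0.6]
    PP -> P NP      [1.0]
    Det -> 'the'    [1.0]
    N  -> 'bear'    [0.5]
    N  -> 'cave'    [0.5]
    V  -> 'walked'  [1.0]
    P  -> 'into'    [1.0]
""")

parser = nltk.ViterbiParser(pgrammar)
for tree in parser.parse("the bear walked into the cave".split()):
    # tree.prob() multiplies the rule probabilities along the derivation:
    # the "weight" this grammar assigns to the sentence.
    print(tree.prob(), tree)
```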

Surprisingly, it has to do with “temperature.” Not literal temperature, as in hot and cold, but a measure akin to temperature. For DeGiuli, hot is associated with randomness while cold is more deterministic, not unlike how high temperature in a room or object is due to the quick and chaotic movement of atoms. He writes that context-free grammars have “two natural ‘temperature’ scales that control grammar complexity, one at the surface interface, and another in the tree interior.” As Ball explains:

DeGiuli’s theoretical analysis—which uses techniques from statistical mechanics—shows that there are two key factors involved: how much the weightings “prune” branches deep within the hierarchical tree, and how much they do so at the surface (where specific sentences appear). In both cases, this sparseness of branches plays a role analogous to a temperature in statistical mechanics. Lowering the temperature both at the surface and in the interior means reducing more of the weights. As the deep temperature is lowered—meaning the tree interior becomes sparser—DeGiuli sees an abrupt switch from [context-free grammars] that are random and disorderly to ones that have high information content. This switch is a phase transition analogous to the freezing of water. He thinks that something like this switch may explain why, at a certain stage of development, a child learns very quickly how to construct grammatical sentences.
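To get a feel for that switch, here is a toy numerical sketch (my own analogy, not DeGiuli’s actual calculation): give each of many configurations a random “energy,” weight them Boltzmann-style, and watch the entropy of the weight distribution collapse as the temperature drops, the way weight would concentrate onto a few grammatical structures as a grammar freezes:

```python
import numpy as np

rng = np.random.default_rng(0)
energies = rng.uniform(0.0, 5.0, size=1000)  # one random "energy" per configuration

def entropy_at_temperature(energies, T):
    """Shannon entropy of Boltzmann weights w_i proportional to exp(-E_i / T)."""
    weights = np.exp(-energies / T)
    p = weights / weights.sum()
    return -np.sum(p * np.log(p + 1e-300))  # tiny constant guards log(0)

for T in (10.0, 1.0, 0.1, 0.01):
    print(f"T = {T:5.2f}   entropy = {entropy_at_temperature(energies, T):.3f}")

# High T: entropy near log(1000); nearly every configuration counts (randomness).
# Low T: entropy collapses; the weight "freezes" onto a handful of configurations.
```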

The idea that language locks into place, as water does when it’s chilled, is a weird, attractive one. It’s fair to say that something has “crystallized” in the mind of a child able to talk. How exactly that happens, though, is difficult to say, as with so much in the natural world on which theoretical physics turns its gaze. As I play with my daughter and listen to her form proto-words, I sympathize with Yang and his colleagues: “The growth of language is nothing short of a miracle.”

Brian Gallagher is the editor of Facts So Romantic, the Nautilus blog. Follow him on Twitter @BSGallagher.
