The Biology of . . . Baby Talk

Why scientists go gaga over infants' goo-goos

By Mary Duenwald
Dec 3, 2003


Rebecca looks bizarre with brain-imaging gizmos attached to her little bald head—like a baby who has crept into Dr. Frankenstein’s lab. A white terry-cloth headband holds two plastic squares against either side of her skull. Each contains a set of black rods with a welter of wires. Rebecca seems oblivious to the headgear as she turns her head from side to side with a wet, toothless grin. She isn’t yet five months old, but according to Laura-Ann Petitto, a cognitive neuroscientist at Dartmouth College, she is already using the parts of her brain involved in language. And the contraption on her head is designed to let Petitto watch her do it.

Photograph by Dirk Anschütz

At Dartmouth College, laser light is beamed into a baby's head. Researchers analyze reflections from the laser for signs of brain activity during babbling.

Known as near-infrared spectroscopy, this technology is designed to show which part of the brain governs a given behavior by measuring where the brain uses the most oxygen. Petitto is learning how to use the device, and in time she hopes to zero in on an area just above the left ear that may play a prominent role in language acquisition. “Language is the looming contributor to this thing we call consciousness, which is at the heart of reason, emotion—the individual,” she says. “Think about what we’re doing right now. I’m sending sound waves through the air. I’m not even touching you. Yet you have explosions of meaning in your head. By what mechanism does our species accomplish this truly astounding feat?”

For a scientist trying to answer this question, babies are the ultimate black box. They can’t explain a word of what’s happening inside their small developing brains, yet that’s where language—with all its complexities of grammar and vocabulary—is born. “You wouldn’t expect babies to be better than adults at anything,” says Jenny Saffran, director of the Infant Learning Laboratory at the University of Wisconsin at Madison, “but they are better at learning language.”

Babbling—the stringing together of repetitive syllables, as in da, da, da, da, da or ga, ga, ga, ga, ga—is one of the earliest stages of language acquisition. Babbling allows babies to learn and practice sounds they will one day use to create language. And so scientists have listened to babies babble, and they have watched babies babble. And if this new spectroscopy lives up to its promise, they may soon be able to watch babies’ brains operate as they babble.

Babbling is universal. No matter where babies are born or to which language they’re exposed, they begin—between 5 and 10 months of age—to rhythmically repeat syllables. As they do, they often accompany themselves with equally rhythmic movements of their hands and feet. Petitto says they’re especially fond of shaking their right hand, or a rattle, while they babble.

“People used to think that language grew from our capacity to produce and hear speech,” Petitto says. “If that were true, then a child who is stripped of speech should learn language in a different way.” In fact, she says, babies can even babble in sign language.

Brain scans show that Broca’s area, located behind the left temple, helps us produce language and understand grammar, while Wernicke’s area, just above the left ear, helps us receive language and decipher its meaning. It’s still not clear when these language centers develop.

A few years ago, Petitto and her colleagues attached light-emitting diodes to the hands of babies learning to sign and others learning to speak. An electronic device recorded the trajectory, velocity, and frequency of the babies’ hand movements. Both groups, Petitto found, made rhythmic hand gestures with a frequency of about 3 hertz—three complete movements a second. But the babies exposed to sign made a second kind of movement as well, this one with a frequency of 1 hertz, or roughly one movement per second.

The timing is significant because it’s almost equivalent to the length of one unit of spoken babble: da, da, da, da, da. To Petitto, this suggests that language grows from a part of the brain that can work with either sign or sound—one that is wired to register the bursts of aural or visual communication that are the building blocks of words. “A baby finds delicious, and is very powerfully attracted to, anything that has these rhythmic undulations,” she says.

Encouraged by parents and others, babies gradually learn to identify which of the millions of sounds they hear are actually words. They learn, for example, that when they hear someone say “pretty baby,” pretty is a word and baby is a word, but ty-ba is not a word.

Saffran has looked into how babies do this by exposing them to made-up words, such as golabu and daropi, and repeating them over and over. She has found that babies compute, unconsciously, the probabilities that certain sounds will be paired together. “It’s statistical learning,” she says. “They learn how often they hear pre before ty and ba before by.” If the sounds come up together often enough, the babies hear them as distinct words.
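The statistical learning Saffran describes amounts to tracking transitional probabilities between adjacent syllables: pairs that reliably follow one another are heard as word-internal, while unreliable pairs mark word boundaries. Here is a minimal sketch in Python using the article's made-up words (golabu, daropi); the word list, stream length, and function names are illustrative assumptions, not part of the study.

```python
import random
from collections import Counter

# Hypothetical stimulus: the article's made-up words, split into syllables
# and repeated in random order to form one continuous stream of sound.
WORDS = [("go", "la", "bu"), ("da", "ro", "pi")]

random.seed(0)
stream = [syl for _ in range(100) for syl in random.choice(WORDS)]

# Count each syllable and each adjacent syllable pair in the stream.
pair_counts = Counter(zip(stream, stream[1:]))
syllable_counts = Counter(stream[:-1])

def transition_prob(a, b):
    """P(next syllable is b, given the current syllable is a)."""
    return pair_counts[(a, b)] / syllable_counts[a]

# Within a word the transition is certain; across a word boundary it is not.
print(transition_prob("go", "la"))  # always 1.0: "la" follows every "go"
print(transition_prob("bu", "da"))  # a word boundary: either word may follow
```

A listener (or a baby, on Saffran's account) who tracks these probabilities can segment words out of continuous sound: probabilities stay high inside a word and dip sharply at its edges.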

Petitto has begun to home in on the part of the brain that controls babbling and the early development of language. In a study reported in Science a year ago, she and her colleagues videotaped the mouths of babbling babies. They found that the babies were opening the right sides of their mouths wider than the left. Given that the left side of the brain controls the right side of the body, this suggests that babbling is mainly a left-brain activity.

Brain-scan panels: starfish left, starfish right, speech left, speech right.

When infants in the Dartmouth study were shown a moving picture of a starfish, the left and right hemispheres of their brains lit up (dark red indicates high brain activity; dark blue, low activity)—but not as much as when researchers spoke to them, saying: “Hello, baby. Are you a good baby?” The left hemisphere, which neuroscientists believe controls early speech production, lit up most brightly.

Images courtesy of Laura-Ann Petitto/Dartmouth

Within the left brain, Petitto has her sights set on the planum temporale, a piece of the superior temporal gyrus, which is a chunk of brain about the size and shape of an index finger that curves over the top of the ear. The superior temporal gyrus is known to be part of the broad neural network that adults use in listening to and producing language. In studies of adults, Petitto has found that both hearing and deaf adults use the planum temporale—mainly on the left side—to process syllables, whether signing or speaking aloud.

The beauty of near-infrared spectroscopy is that it enables Petitto to see how babies’ brains operate while they’re awake and learning to talk. MRI scans don’t work because the babies would have to lie perfectly still. Petitto’s machine, made by Hitachi, uses weak infrared light from a laser diode, which shines through the skull and then about an inch farther into the brain. The amount of light reflected back from each region is determined by how much blood and oxygen the brain is using in that area. The more oxygen being used, the harder the brain is working.

As the machine probes Rebecca’s brain, she sits on her mother’s lap and the room goes quiet and dark. On a video screen, a young woman silently holds her right palm up flat like a traffic cop, then rhythmically rotates it—palm, back of the hand, palm, back of the hand—every second and a half.

Rebecca watches for less than a minute before starting to sigh, fidget, and kick her feet. But in that time a computer has recorded how her brain operates. The planum temporale “was clearly the part of the brain that was activated,” Petitto says, and the same was true for the 10 babies who were examined before Rebecca. So far, she says, “the data are gorgeous.”

Petitto wants to scan at least 100 babies before reaching any conclusions. Then she wants to use near-infrared spectroscopy on babies who are in the act of babbling. “I want to crack the code,” she says.

Five Stages of Baby Talk

1 PHONATION (0 to 2 months): Babies make their first sounds other than crying, often without opening their mouths. Example: a staccato hmm, hmm, timed with exhalations.

2 PRIMITIVE ARTICULATION (1 to 4 months): Babies use their tongue and jaw to form new sounds. Examples: gleh, glechh.

3 EXPANSION (3 to 8 months): Babies squeal, yell, or whisper, as if exploring the range of sounds, pitch, and amplitude the mouth can manage. Examples: shrieks, growls, Bronx cheers.

4 BABBLING (5 to 10 months): Babies begin to form their first syllables. Examples: ba, ba, ba, ba or da, da, da, da, da.

5 SOPHISTICATED BABBLING (9 to 18 months): Babies combine syllables such as ba, da, ga, mix in real words such as dada or mama, and string together meaningless sounds with the rhythm and pacing of a real sentence.

Source: D. Kimbrough Oller, University of Memphis
