Lost languages leave traces on the brain

When we hear a language for the first time, our brains start taking in details about it. One of the first things babies learn about their native language is its consonants and vowels, along with other aspects of speech sounds, like pitch. In the first year of life, a baby’s ear tunes in to the particular sounds it hears around it, and its brain learns to tell the small differences between them. This foundation helps the child learn words and grammar later on.

But what happens if that child is moved to a different culture after it has started learning its first language? Does it forget everything it knew of that language, or does some of it stick?

A new PNAS paper suggests that the effects of learning a language very early in life are permanent, even if you stop hearing that language and start hearing another one instead. The researchers used functional magnetic resonance imaging (fMRI) to scan children years after their adoption, looking for brain activity patterns left behind by the language they heard first.

Since not all features of language leave clear signatures in the brain, the researchers chose to look at lexical tone. This feature of some languages lets the same sequence of consonants and vowels mean different things depending on the pitch with which it is spoken. For example, in Mandarin Chinese, the syllable “ma” means “hemp” when the tone rises and “scold” when the tone falls.
A particular region of the brain’s left hemisphere works differently in people who speak tone languages: it is activated by pitch differences that signal differences in linguistic meaning. The right hemisphere, by contrast, processes pitch that has nothing to do with language. Tone information is learned very early; four-month-old babies learning Chinese languages like Mandarin and Cantonese can already tell the tones apart.

The researchers looked at 21 children who had been adopted from China when they were young. Their average age at adoption was 12.8 months, so they had probably already learned to recognize tone before they were adopted. Since their adoption, the children had heard only French; they had grown up speaking only French and remembered nothing of Chinese.

As controls, the researchers used 11 children who spoke only French and 12 who spoke both Chinese and French. The kids, all between 9 and 17 years old, performed a tone-discrimination task while in the fMRI scanner. They heard pairs of phrases made up of nonsense words built from Chinese speech sounds (like “brillig” or “string”), or hummed versions of the phrases that conveyed nothing but the tones. The final syllables of the two phrases in each pair were either the same or different, and the kids pushed a button to indicate which.

All of the kids performed the task with very high accuracy, and there were no differences in accuracy or reaction times between the groups. But their fMRI scans showed that they were processing the information in different ways.
Children who spoke both Chinese and French used the left-hemisphere region that speakers of tone languages use. In contrast, children who spoke only French used their right hemispheres, as they would for any complex sound. The adopted children who had once heard Chinese showed the same pattern as the Chinese-French bilinguals: activity in the left-hemisphere tone area.

The activation was also stronger in kids who were older when they were adopted. The researchers think this shows that the brain’s representation of lexical tone gets stronger with more exposure. But the amount of activation didn’t vary with how long it had been since the children were adopted, which could mean that once tone is represented in the brain, time doesn’t weaken or erase it.

Dr. Cristina Dye, who studies how children learn language, says that this study is especially useful because lexical tone is a great way to probe this question. Previous studies of the same question used tasks that required more complex language skills, which very young children are less likely to have acquired. Lexical tone is also very hard for adults to learn, so any traces of it most likely date from early childhood.

Like many fMRI studies, this one has a small sample, because the technology is expensive and the criteria for participation are strict. Dye says the results back up behavioral studies that have found similar traces of lost languages.

The next question the researchers want to answer, they write, is whether the neural traces of a forgotten first language affect how the brain learns or processes other languages. There may also be implications for relearning lost languages: people who have forgotten a language may be able to relearn it faster or more easily than people who have never spoken it.
