Japanese infants determine where words start and finish by employing a neural circuit attuned to speech sounds
Published online 25 May 2017
To newborns, adult speech sounds like a constant stream of babble. But at some point between their first squeals and first steps, babies begin to recognize the beginning and end of words -- an important development in the process of language acquisition known as word segmentation. Keio University researchers have found that Japanese babies can segment words at seven months -- two months earlier than previously thought -- and have clearly identified the brain regions that help infants do this [1].
Previous studies into the neural basis of word segmentation have used a technique known as event-related potentials (ERP), which can measure electrical activity in the brain evoked by specific stimuli but usually offers only coarse spatial localization of brain activity. In contrast, Yasuyo Minagawa and colleagues at Keio used functional near-infrared spectroscopy (fNIRS), which measures changes in hemoglobin concentration associated with neural activity, to reveal the exact brain regions engaged in word segmentation.
The researchers separated a cohort of 54 Japanese infants into three age-groups (5-6 months, 7-8 months, and 9-10 months) and introduced them to simple Japanese words, such as tanishi (mud snail) and zakuro (pomegranate).
In an initial training session, the researchers attracted the infants' attention to a screen, where single target words were read aloud together with a lively animation. This was followed by behavioral testing, in which the target words and control words were recited as part of a sentence.
Infants aged 9-10 months spent significantly more time looking at the screen when sentences included the target words, indicating their ability to segment the words from the sentences.
Neuroimaging using fNIRS, however, indicated that this skill develops slightly earlier. During the training task, infants in the 7-8-month and 9-10-month age-groups showed significant activation in brain regions that play a role in encoding and storing short-term phonetic memory. The test session, in contrast, significantly activated regions associated with retrieving the phonetic memory.
These results corroborate previous ERP studies, but reveal for the first time the exact regions that play a role in infant word segmentation. These regions seem to be part of the early cerebral circuit involved in articulating sounds (dorsal pathway) as opposed to the circuit that registers meaning (ventral pathway).
Furthermore, the study pinpoints when Japanese babies begin to segment words in their native language: around the same age as babies learning English and German, and slightly earlier than babies learning French.
Minagawa hopes to further study how Japanese babies learn the meaning of words, and to explore the role of social interaction in language acquisition.