In their early learning of language, babies link the sound of a word with the image of an object, and this is an important ability. How do they come to have this mechanism? Are there predispositions to making links between sounds and images?
Research by Asano and others (citation below) demonstrates one type of link. They show that infants on the verge of learning language (about 11 months old) can use sound symbolism to match certain pseudo-words to drawings: “moma” to rounded shapes and “kipi” to sharply angled shapes. Sound symbolism is interesting, but it need not be the first or most important link between auditory and visual information. It seems to me that an 11-month-old child would associate barks with dogs, twitters with birds, honks and engine noises with cars, and so on. Children even mimic sounds to identify objects. Clearly, objects are recognized by their feel, smell, and sound as well as by sight. The ability to derive meaning from sound is completely natural, as is deriving it from sight. What is important is not the linking of sound and sight to the same meaning or object; mammals without language have this ability.
What is important about sound symbolism is that it is arbitrary and abstract. We appear to be born with certain connections between phonemes and meanings, ready to be used. These sorts of connections would be a great help to a child grasping the nature of language, as opposed to natural sounds.
Here is the abstract: “A fundamental question in language development is how infants start to assign meaning to words. Here, using three Electroencephalogram (EEG)-based measures of brain activity, we establish that preverbal 11-month-old infants are sensitive to the non-arbitrary correspondences between language sounds and concepts, that is, to sound symbolism. In each trial, infant participants were presented with a visual stimulus (e.g., a round shape) followed by a novel spoken word that either sound-symbolically matched (“moma”) or mismatched (“kipi”) the shape. Amplitude increase in the gamma band showed perceptual integration of visual and auditory stimuli in the match condition within 300 msec of word onset. Furthermore, phase synchronization between electrodes at around 400 msec revealed intensified large-scale, left-hemispheric communication between brain regions in the mismatch condition as compared to the match condition, indicating heightened processing effort when integration was more demanding. Finally, event-related brain potentials showed an increased adult-like N400 response - an index of semantic integration difficulty - in the mismatch as compared to the match condition. Together, these findings suggest that 11-month-old infants spontaneously map auditory language onto visual experience by recruiting a cross-modal perceptual processing system and a nascent semantic network within the first year of life.”
Asano, M., Imai, M., Kita, S., Kitajo, K., Okada, H., & Thierry, G. (2015). Sound symbolism scaffolds language development in preverbal infants. Cortex, 63, 196-205. DOI: 10.1016/j.cortex.2014.08.025
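For readers who want a concrete sense of the three measures named in the abstract (gamma-band amplitude, inter-electrode phase synchronization, and the N400 event-related potential), here is a minimal Python sketch computing each one on synthetic data. The sampling rate, frequency band, time windows, and channel choices are my own illustrative assumptions, not the authors' actual analysis pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(0)

# Synthetic stand-in for segmented EEG: 60 trials, 2 electrodes,
# 1 s of data at 250 Hz, time-locked to word onset. All numbers here
# (rates, bands, windows) are illustrative assumptions.
fs = 250
n_trials, n_channels, n_samples = 60, 2, fs
times = np.arange(n_samples) / fs              # 0..1 s after word onset
epochs = rng.normal(0.0, 1.0, (n_trials, n_channels, n_samples))

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass along the last axis."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

# 1) Gamma-band amplitude: filter 25-40 Hz, take the Hilbert envelope,
#    then average power over the first 300 ms after word onset.
gamma = bandpass(epochs, 25.0, 40.0, fs)
envelope = np.abs(hilbert(gamma, axis=-1))
early = times < 0.30
gamma_power = (envelope[..., early] ** 2).mean()
print(f"mean gamma power, 0-300 ms: {gamma_power:.3f}")

# 2) Phase synchronization between the two electrodes: phase-locking
#    value (PLV) across trials in a window around 400 ms.
phase = np.angle(hilbert(gamma, axis=-1))
window = (times > 0.35) & (times < 0.45)
dphi = phase[:, 0, window] - phase[:, 1, window]      # phase differences
plv = np.abs(np.exp(1j * dphi).mean(axis=0)).mean()   # 0 = none, 1 = perfect
print(f"PLV between electrodes near 400 ms: {plv:.3f}")

# 3) N400-style ERP effect: average trials per condition, then compare
#    mean amplitude in a 350-450 ms window. The two "conditions" here
#    are just random halves of the synthetic trials.
erp_match = epochs[:30].mean(axis=0)                  # (channels, samples)
erp_mismatch = epochs[30:].mean(axis=0)
n400_effect = (erp_mismatch[:, window] - erp_match[:, window]).mean()
print(f"mismatch-minus-match amplitude, 350-450 ms: {n400_effect:.3f}")
```

In a real study these quantities would be computed on artifact-cleaned, baseline-corrected epochs and tested statistically across the match and mismatch conditions; the sketch only shows the shape of each computation.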