How do people coordinate their actions? How does communication work, how does it affect people, and how do minds get in sync? When people communicate they really do get in sync, but there is nothing magical about it. We perceive the outside world, signals as well as scenery; we model this input and think about it; we can then act on the basis of that cognition. The pathways of the action-perception cycle are there whether we are alone or engaged socially. The coupling of two brains in communication has not been studied very often because it is difficult: practically speaking, there is usually only one fMRI scanner available, and it holds only one person at a time. A paper by Hasson and colleagues (see citation below) highlights this problem. Here are the abstract and the conclusion.
Cognition materializes in an interpersonal space. The emergence of complex behaviors requires the coordination of actions among individuals according to a shared set of rules. Despite the central role of other individuals in shaping one’s mind, most cognitive studies focus on processes that occur within a single individual. We call for a shift from a single-brain to a multi-brain frame of reference. We argue that in many cases the neural processes in one brain are coupled to the neural processes in another brain via the transmission of a signal through the environment. Brain-to-brain coupling constrains and shapes the actions of each individual in a social network, leading to complex joint behaviors that could not have emerged in isolation.
The structure of the shared external environment shapes neural responses and behavior. Some aspects of the environment are determined by the physical environment. Other aspects, however, are determined by a community of individuals, who together establish a shared set of rules (behaviors) that shape and constrain the perception and actions of each member of the group. For example, human infants undergo a period of perceptual narrowing whereby younger infants can discriminate between social signals from multiple species and cultures, but older infants fine tune their perception following experience with their native social signals. Coupled brains can create new phenomena, including verbal and nonverbal communication systems and interpersonal social institutions, that could not have emerged in species that lack brain-to-brain coupling. Thus, just as the Copernican revolution simplified rather than complicated understanding of the physical world, embracing brain-to-brain coupling as a reference system may simplify understanding of behavior by revealing new forces that operate among individuals and shape one’s social world.
I found several parts of the paper very interesting. First, the authors make the point that language seems to be geared to a 3-8 Hz rhythm. That is about 3 to 8 syllables per second, and it can be found in the auditory processing of language, in the sound delivery of speakers, and in the movements of their mouths. Since that rhythm, the theta band in the brain, is a constant beat in all of us when we are awake, it does not have to be created, only aligned between speaker and listener. The listener will mimic the rhythm of the speaker (and when they speak, they will also imitate the sounds, grammar, words and meaning of their partner in conversation).
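To make the idea of "aligning to the theta rhythm" concrete in signal terms, here is a minimal sketch of my own (not an analysis from the paper): it band-pass filters a noisy speech-envelope-like signal into the 3-8 Hz theta band and recovers the dominant syllabic rate. The sampling rate, the 5 Hz rhythm, and the noise level are all invented for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 200.0                      # sampling rate in Hz (assumed for the demo)
t = np.arange(0, 10, 1 / fs)    # 10 seconds of signal

# Synthetic "speech envelope": a 5 Hz syllabic rhythm buried in noise
rng = np.random.default_rng(0)
envelope = np.sin(2 * np.pi * 5 * t) + 0.8 * rng.standard_normal(t.size)

# Band-pass filter into the 3-8 Hz theta band (cutoffs relative to Nyquist)
b, a = butter(4, [3 / (fs / 2), 8 / (fs / 2)], btype="band")
theta = filtfilt(b, a, envelope)

# The dominant frequency of the filtered signal is the syllable rate
spectrum = np.abs(np.fft.rfft(theta))
freqs = np.fft.rfftfreq(theta.size, 1 / fs)
rate = freqs[np.argmax(spectrum)]
print(f"dominant rhythm: {rate:.1f} Hz")   # ~5 Hz, i.e. ~5 syllables per second
```

The point of the sketch is only that a 3-8 Hz rhythm survives a noisy channel well enough to be picked out and locked onto, which is what alignment between speaker and listener requires.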
Second, communication must be learned because it requires shared rules, usage, language, customs and culture. For babies and songbirds, learning is only really successful in a social context of face-to-face communication. The learner must understand that this is an actual communication with its teacher, or the learner does not learn. “The babbling of a 7-12 month-old infant exhibits a pitch, rhythm and even a syllable structure that is similar to the ambient language.” The adult caregiver has to respond to the babbling of the infant, and the infant must react to the caregiver’s responses – it requires actual social interaction.
Third, in fMRI scans of speakers and listeners, there are activities that would not be noticed if only one of the individuals was scanned. There are areas that are in sync between the two brains. There are areas in the listener that follow the speaker – like an imitating action. And, surprisingly, there are areas in the listener that lead the speaker – like a prediction. In other words, the process of listening uses the mechanisms of action without the action.
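The follow/lead distinction can be pictured as a lagged correlation between two time series. The toy sketch below is illustrative only, not the paper's actual analysis pipeline: it finds the lag at which a "listener" signal best matches a "speaker" signal. A positive peak lag means the listener trails (imitates) the speaker; a negative one means the listener runs ahead (predicts). The signals here are random noise shifted by a known amount.

```python
import numpy as np

def peak_lag(speaker, listener, max_lag):
    """Return the lag (in samples) at which the listener signal best
    correlates with the speaker signal. Positive = listener follows;
    negative = listener leads."""
    def corr_at(lag):
        if lag >= 0:
            a, b = speaker[: len(speaker) - lag], listener[lag:]
        else:
            a, b = speaker[-lag:], listener[: len(listener) + lag]
        return np.corrcoef(a, b)[0, 1]
    return max(range(-max_lag, max_lag + 1), key=corr_at)

rng = np.random.default_rng(1)
speaker = rng.standard_normal(500)

# A "following" listener: the speaker's signal delayed by 7 samples
follower = np.roll(speaker, 7)
print(peak_lag(speaker, follower, 20))   # 7  -> listener trails the speaker

# A "predicting" listener: the speaker's signal anticipated by 5 samples
leader = np.roll(speaker, -5)
print(peak_lag(speaker, leader, 20))     # -5 -> listener runs ahead
```

Real intersubject analyses are of course far more involved, but the sign of the peak lag is the essence of how "following" and "leading" regions can be told apart.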
As indicated in the last two postings, it is not clear that an entirely new brain structure was needed for language or for communication. Tweaks to existing systems in the brain can give us linguistic communication. The idea usually credited to Chomsky – that a single mutation (or a small number of simultaneous mutations) only 50 to 140 thousand years ago gave a nearly full-blown language faculty almost instantaneously – has always seemed like a bit of an unlikely miracle. It also appears less and less necessary as language and communication become better understood.
Hasson, U., Ghazanfar, A., Galantucci, B., Garrod, S., & Keysers, C. (2012). Brain-to-brain coupling: a mechanism for creating and sharing a social world. Trends in Cognitive Sciences, 16(2), 114-121. DOI: 10.1016/j.tics.2011.12.007