Hand gestures

About 20 years ago I took an interest in the non-verbal part of speech communication: gesture, facial expression, posture, tone of voice. During this time I watched the hands of speakers carefully and noted how they gestured. I saw four types of movement that seemed distinct:

One.. Word gestures took the place of words, quite literally. They were made at the point where the word would have been used, and a gap was left in the speech for the gesture to fit into, so to speak. Also treated like words were the (sometimes impolite) gestures for which Italians are famous.

Two.. Illustrating gestures do not interrupt speech but are separately ‘saying’ the same thing as the words, like miming while talking.

Three.. There are emotional gestures that are very ancient and even understood across species. They are often completely unconscious. Palms towards the body communicate submission or at least non-aggression. Palms away from the body communicate rejection or defense.

Four.. The fourth type is also usually unconscious. I called these baton gestures. They set a rhythm to the speech, and quite often the listener moved in keeping with the baton. They also seemed to emphasize important phrases. The baton beat appeared to mark out groups of words that should be processed together, a great help to listeners if they used it to wrap up one meaning and start on the analysis of the next words.

It is this last type that has been the subject of a recent paper. Unfortunately I have no access to the paper and must be content with the abstract. (grr) Here are the abstracts of this paper and an earlier one by the same authors.

Abstract of (Biau, Torralba, Fuentemilla, Balaguer, Soto-Faraco; Speaker’s hand gestures modulate speech perception through phase resetting of ongoing neural oscillations; Cortex Dec 2014) “Speakers often accompany speech with spontaneous beat gestures in natural spoken communication. These gestures are usually aligned with lexical stress and can modulate the saliency of their affiliate words. Here we addressed the consequences of beat gestures on the neural correlates of speech perception. Previous studies have highlighted the role of theta oscillations in temporal prediction of speech. We hypothesized that the sight of beat gestures may influence ongoing low-frequency neural oscillations around the onset of the corresponding words. Electroencephalographic (EEG) recordings were acquired while participants watched a continuous, naturally recorded discourse. The phase-locking value (PLV) at word onset was calculated from the EEG from pairs of identical words that had been pronounced with and without a concurrent beat gesture in the discourse. We observed an increase in PLV in the 5-6 Hz theta range as well as a desynchronization in the 8-10 Hz alpha band around the onset of words preceded by a beat gesture. These findings suggest that beats tune low-frequency oscillatory activity at relevant segments during natural speech perception, providing a new insight of how speech and paralinguistic information are integrated.”
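The key quantity here, the phase-locking value, is simple enough to sketch. Below is a minimal illustration (my own reconstruction, not code from the paper) of how a PLV around word onset is typically computed from EEG epochs. It assumes numpy/scipy, a single electrode, and epochs already cut and time-locked to word onset; all names are hypothetical.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def phase_locking_value(epochs, fs, band=(5.0, 6.0)):
        # epochs: array of shape (n_trials, n_samples), EEG segments from
        # one electrode, time-locked to word onset (hypothetical input).
        # fs: sampling rate in Hz. band: frequency band of interest;
        # (5, 6) Hz matches the theta range reported in the abstract.

        # Band-pass filter each trial to isolate the band of interest.
        nyquist = fs / 2.0
        b, a = butter(4, [band[0] / nyquist, band[1] / nyquist], btype="band")
        filtered = filtfilt(b, a, epochs, axis=1)

        # Instantaneous phase from the analytic signal (Hilbert transform).
        phase = np.angle(hilbert(filtered, axis=1))

        # PLV(t) = |mean over trials of exp(i * phase(t))|: close to 1 when
        # phases line up across trials, close to 0 when they are random.
        return np.abs(np.mean(np.exp(1j * phase), axis=0))

The paper’s comparison would then amount to running this separately on epochs of the same words pronounced with and without a beat gesture, and contrasting the two curves around word onset.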

Abstract of (Biau, Soto-Faraco; Beat gestures modulate auditory integration in speech perception; Brain and Language 124, 2, Feb 2013) “Spontaneous beat gestures are an integral part of the paralinguistic context during face-to-face conversations. Here we investigated the time course of beat-speech integration in speech perception by measuring ERPs evoked by words pronounced with or without an accompanying beat gesture, while participants watched a spoken discourse. Words accompanied by beats elicited a positive shift in ERPs at an early sensory stage (before 100 ms) and at a later time window coinciding with the auditory component P2. The same word tokens produced no ERP differences when participants listened to the discourse without view of the speaker. We conclude that beat gestures are integrated with speech early on in time and modulate sensory/phonological levels of processing. The present results support the possible role of beats as a highlighter, helping the listener to direct the focus of attention to important information and modulate the parsing of the speech stream.”
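The ERP analysis in this earlier study is conceptually even simpler: average the time-locked EEG epochs within each condition and look at where the averages diverge. A minimal sketch, again my own and with hypothetical names:

    import numpy as np

    def erp(epochs):
        # epochs: array of shape (n_trials, n_samples), time-locked to
        # word onset; the ERP is simply the average across trials.
        return epochs.mean(axis=0)

    # Hypothetical comparison mirroring the design in the abstract:
    # the same word tokens heard with and without a beat gesture.
    # difference = erp(epochs_beat) - erp(epochs_no_beat)
    # ...then inspect the pre-100 ms window and the P2 window.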

Also from a summary of an oral presentation by the same group: “We observed an increase in phase-locking at the delta–theta frequency range (2–6 Hz) from around 200 ms before word-onset to 200 ms post word-onset, when words were accompanied with a beat gesture compared to audio alone. Furthermore, this increase in phase-locking, most noticeable at fronto-central electrodes, was not accompanied by an increase in power in the same frequency range, confirming the oscillatory-based nature of this effect. These results suggest that beat gestures are used as robust predictive information capable to tune neural oscillations to the optimal phase for auditory integration of relevant parts of the discourse during natural speech processing.”
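The point about phase-locking rising without a rise in power is what licenses the ‘phase resetting’ interpretation: the ongoing oscillation is re-aligned rather than simply made bigger by an added burst of activity. A companion check to the PLV sketch above, again my own hedged reconstruction rather than the group’s actual analysis:

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def band_power(epochs, fs, band=(2.0, 6.0)):
        # Mean power across trials in the band; if PLV increases while
        # this curve stays flat, the effect is a phase reset rather than
        # an added evoked response.
        nyquist = fs / 2.0
        b, a = butter(4, [band[0] / nyquist, band[1] / nyquist], btype="band")
        analytic = hilbert(filtfilt(b, a, epochs, axis=1), axis=1)
        return (np.abs(analytic) ** 2).mean(axis=0)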

This research points to a synchronization between speaker and listener in which a visual cue is used to divide the speech stream into chunks that can be processed (at least to a large degree) in isolation from the words before and after. The warning at the beginning of a new chunk, given automatically by the speaker’s hands, is used automatically by the listener to ‘clear the decks’ and begin a new chunk. This takes some of the strain out of listening. Of course, this information is probably carried by the voice as well; redundancy in oral language is common. Conversation is a wonderful dance of voice, face, hands and body that transfers an idea from one brain to another. It only seems easy because all of this complexity is handled automatically.

 
