Session: Panel
20 - “What does the brain reveal about the relations between speech and gesture?”
Kelly, S., Ward, S., Creigh, P. (Hamilton) - discussant: Feyereisen, P. (Louvain): “Does a Communicator’s Intent Play a Role in the Brain’s Comprehension of Speech and Gesture?”
Saturday, June 18 - 10:30-11:00
(Amphithéâtre)
Kelly, Spencer
Ward, Sarah
Creigh, Peter
Neuroscience Program, Colgate University, Hamilton
Does a Communicator’s Intent Play a Role in the
Brain’s Comprehension of Speech and Gesture?
Hand gestures are tightly integrated with speech during the brain’s
comprehension of language (Kelly, Kravitz & Hopkins, 2004). Furthermore,
research has demonstrated that gestures play a “special”
role with speech during language comprehension; the brain processes
gesture differently than other visual information that accompanies
speech (Kelly and Kravitz, 2004). The present study investigates
whether an interlocutor’s belief that speech and gesture are
intentionally linked is one explanation for why gesture plays a special
role during language comprehension.
Ten adult participants watched videos of speech and gesture while
ERPs were recorded in response to the speech. ERPs are averaged,
time-locked segments of the ongoing electrical activity generated
by the brain (EEG), which can be recorded from the scalp using a
non-invasive electrode net. The gestures on the video had varying
relationships with the accompanying speech: matching, complementary
and mismatching (see Kelly et al., 2004). In addition, there were
two “intentionality” conditions. Half of the stimuli consisted of
gestures that were intentionally produced with the speech, and the
other half of gestures that were produced unintentionally.
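The ERP logic described above (averaging EEG segments time-locked to stimulus onsets, so that stimulus-related activity survives while unrelated background activity cancels out) can be sketched in a few lines. All numbers below, including the sampling choices, event timings, and the shape of the simulated evoked response, are invented for illustration and do not come from the study:

```python
import numpy as np

def erp_average(eeg, event_samples, pre, post):
    """Average time-locked EEG segments (epochs) around stimulus onsets.

    Activity that is time-locked to the events survives the average;
    background EEG unrelated to the events tends to cancel across trials.
    """
    epochs = [eeg[s - pre : s + post]
              for s in event_samples
              if s - pre >= 0 and s + post <= len(eeg)]
    return np.mean(epochs, axis=0)

# Toy demonstration: a small evoked deflection, buried in noise on any
# single trial, becomes visible after averaging many time-locked trials.
rng = np.random.default_rng(0)
n_samples, pre, post = 100_000, 50, 150
events = np.arange(500, n_samples - 500, 400)               # hypothetical stimulus onsets
evoked = np.exp(-0.5 * ((np.arange(post) - 60) / 15) ** 2)  # stylized early peak
eeg = rng.normal(scale=2.0, size=n_samples)                 # ongoing background activity
for s in events:
    eeg[s : s + post] += evoked                             # same response on every trial

erp = erp_average(eeg, events, pre, post)                   # peak emerges near sample pre + 60
```

In practice, components such as the N1 and N400 mentioned below are identified as deflections in such averages within characteristic time windows after stimulus onset.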
Replicating previous studies, we found that gestures influenced the
brain’s processing of speech both in early (N1) and late (N400) ERP
components for mismatching versus matching and complementary
conditions. We are currently analyzing the intentionality effects, but
we hypothesize that the gesture effects will be more significant in
the “intentional” versus “unintentional” condition.
The finding that gesture influences early and late neural processing
of language supports claims that gesture and speech form an
integrated system of communication (McNeill, 1992). Moreover, the hypothesized intentionality result suggests that a communicator’s
intent may be an important component in an interlocutor’s integration
of speech and gesture at comprehension.