Session: Panel
20 - “What does the brain reveal about the relations between speech and gesture?”
Ozyurek, A., Willems, R., Hagoort, P. (Nijmegen) - discussant: Feyereisen, P. (Louvain): “Localization and processing of semantic information from speech and gesture in the brain”
Saturday, 18 June, 11:00-11:30
(Amphithéâtre)
Ozyurek, Asli
Willems, Roel
Hagoort, Peter
F.C. Donders Center for Cognitive Neuroimaging, Radboud University, Nijmegen
Localization and processing of semantic information
from speech and gesture in the brain
Previous behavioural research has shown that hand gestures are
produced simultaneously with the semantically relevant speech segment,
suggesting a high level of binding between the two modalities
(e.g., a rolling gesture is usually produced with the bracketed part of
speech in the following sentence: “the ball fell and [rolled down] the
street”) (McNeill, 1992; Kita & Ozyurek, 2003). How does the brain
integrate the semantic information coming simultaneously from the
two modalities during online comprehension?
ERP and fMRI data were gathered while subjects heard spoken sentences
and saw co-speech gestures time-locked to the verbal information
in each sentence, as in the example above. Either the verbal content
(i.e., “rolled down”) or the gestural content (a rolling gesture)
matched or mismatched the preceding context of the utterance
(i.e., “the ball fell”).
ERP measures (n=16), time-locked to the onset of both the verb and
the gesture, revealed N400 effects for both the language mismatch and
the gesture mismatch compared with a baseline of matching speech and
gesture. The effect for the gesture mismatch was smaller than that
for the language mismatch; however, the onset of the N400 was similar
for both types of mismatch.
The fMRI results (n=16) showed that both types of mismatch
recruited left inferior frontal cortex (i.e., Broca’s area). However,
modality-specific areas were also observed: stronger activation in
parietal and right temporal regions was found for the gesture mismatch,
and left inferior frontal and left temporal activation for the language
mismatch.
We argue that semantic information from speech and gesture is
processed simultaneously and in an integrated way in relation to the
previous context of the utterance. However, this semantic processing
also involves modality-specific brain areas and activation levels.
Further research is needed to investigate the interactions between
the modality-specific and shared areas of processing during speech
and gesture comprehension.