
Details

Author(s) / Contributors
Title
The visuo-sensorimotor substrate of co-speech gesture processing
Is part of
  • Neuropsychologia, 2023-11, Vol.190, p.108697-108697, Article 108697
Place / Publisher
Elsevier Ltd
Year of publication
2023
Link to full text
Source
Alma/SFX Local Collection
Descriptions/Notes
  • Co-speech gestures are integral to human communication and exhibit diverse forms, each serving a distinct communicative function. However, the existing literature has focused on individual gesture types, leaving a gap in our understanding of how these diverse forms are processed relative to one another. To address this, our study investigated the neural processing of two types of iconic gestures, which represent attributes or event knowledge of entity concepts; beat gestures, which enact rhythmic manual movements without semantic information; and self-adaptors. During functional magnetic resonance imaging (fMRI), systematic randomization and attentive observation of video stimuli revealed a general neural substrate for co-speech gesture processing, primarily in the bilateral middle temporal and inferior parietal cortices, characterizing visuospatial attention, semantic integration of cross-modal information, and multisensory processing of manual and audiovisual inputs. Specific types of gestures and grooming movements elicited distinct neural responses. Greater activity in the right supramarginal and inferior frontal regions was specific to self-adaptors and is relevant to the spatiomotor and integrative processing of speech and gestures. The semantic and sensorimotor regions were least active for beat gestures. The processing of attribute gestures was most pronounced in the left posterior middle temporal gyrus upon access to knowledge of entity concepts. This fMRI study illuminated the neural underpinnings of gesture-speech integration and highlighted the differential processing pathways for various co-speech gestures.
  • We examined the neural processing of diverse hand movements/gestures
  • Hand movements/gestures activated occipitotemporal and inferior parietal regions
  • Entity-related gestures showed distinct brain activity patterns
  • Category-specific gestural information affected brain activity
Language
English
Identifiers
ISSN: 0028-3932
eISSN: 1873-3514
DOI: 10.1016/j.neuropsychologia.2023.108697
Title ID: cdi_proquest_miscellaneous_2877393305
