The role of iconic gestures in predictive language processing: Evidence from corpus analyses and anticipatory eye-movements
This is the first study to investigate the extent to which, and the way in which, gestures coordinate with speech to contribute to prediction, by combining corpus analysis with a visual world eye-tracking experiment. It will uncover the mechanism of predictive gesture-speech integration during cascaded visual and linguistic processing and advance our understanding of the nature of prediction in the multimodal world of language use. We first analyze a corpus of natural multimodal Chinese conversation and find that the gestural stroke begins before its lexical affiliate. Building on this finding, we use a visual world eye-tracking design to ask to what extent an iconic gesture can predict an upcoming noun independently of the predictive power of the linguistic input. We expect the predictive power of gesture to be confirmed experimentally by the finding that the target object attracts more looks when the gesture is present in the input than when it is absent.
Faculteit der Letteren