Next-word Entropy as a Formalisation of Prediction in Sentence Processing

dc.contributor.advisor: Frank, S.L.
dc.contributor.advisor: Lopopolo, A.
dc.contributor.author: Aurnhammer, C.J.
dc.date.issued: 2018-08-27
dc.description.abstract: This thesis investigates, in an exploratory manner, whether there are independent effects of word integration and prediction in human sentence processing data. We train a probabilistic language model to compute two information-theoretic measures: surprisal for integration cost and next-word entropy for prediction. We evaluate the relation of surprisal and next-word entropy to sentence processing effort on self-paced reading times, eye-tracking gaze durations, EEG responses, and fMRI data. We replicate earlier findings by demonstrating that all data sets are sensitive to surprisal. No effects of next-word entropy are found in the self-paced reading data. In the gaze durations, an effect of next-word entropy disappears when surprisal is factored out. In both the EEG and fMRI data, we find brain activation due to prediction that is distinct from that due to integration. Based on these analyses, we support next-word entropy as a valid formalisation of prediction in sentence processing.
dc.identifier.uri: https://theses.ubn.ru.nl/handle/123456789/6190
dc.language.iso: en
dc.thesis.faculty: Faculteit der Letteren
dc.thesis.specialisation: Researchmaster Language and Communication
dc.thesis.studyprogramme: Researchmasters
dc.thesis.type: Researchmaster
dc.title: Next-word Entropy as a Formalisation of Prediction in Sentence Processing
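The two measures named in the abstract have standard information-theoretic definitions: surprisal is the negative log probability of the word that actually occurred, and next-word entropy is the entropy of the model's full distribution over possible next words. A minimal sketch of both, using a made-up toy distribution rather than the thesis's trained language model:

```python
import math

def surprisal(p_word):
    # Surprisal of the observed word in bits: -log2 P(w_t | context).
    return -math.log2(p_word)

def next_word_entropy(dist):
    # Entropy of the next-word distribution in bits: -sum_w P(w) * log2 P(w).
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical next-word distribution after some context (illustrative only).
dist = {"cat": 0.5, "dog": 0.25, "bird": 0.25}
print(surprisal(dist["dog"]))   # surprisal of observing "dog"
print(next_word_entropy(dist))  # uncertainty before the next word appears
```

Note the conceptual difference the thesis exploits: surprisal is computed after the word is seen (integration cost), while next-word entropy is defined before it is seen (prediction uncertainty), so the two can in principle dissociate in behavioural and neural data.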
Files
Original bundle
Name: 5b83f1db3ad3d-Aurnhammer-Thesis-final.pdf
Size: 2.95 MB
Format: Adobe Portable Document Format