Next-word Entropy as a Formalisation of Prediction in Sentence Processing
Issue Date
2018-08-27
Language
en
Abstract
This thesis investigates, in an exploratory manner, whether word integration and word prediction have independent effects in human sentence-processing data. We train a probabilistic language model to compute two information-theoretic measures: surprisal, quantifying integration cost, and next-word entropy, quantifying prediction. We evaluate how surprisal and next-word entropy relate to sentence-processing effort in self-paced reading times, eye-tracking gaze durations, EEG responses, and fMRI data.
We replicate earlier findings by demonstrating that all data sets are sensitive to surprisal. No effects of next-word entropy are found in the self-paced reading data, and in the gaze durations an apparent effect of next-word entropy disappears once surprisal is factored out. In both the EEG and fMRI data, however, we find brain activation due to prediction that is distinct from that due to integration. Based on these analyses, we support next-word entropy as a valid formalisation of prediction in sentence processing.
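The two measures from the abstract can be illustrated on a toy next-word distribution. This is a minimal sketch using made-up probabilities, not the thesis's actual language model: surprisal is the negative log-probability of the word that actually occurs, while next-word entropy is the expected surprisal over the whole distribution, computed before the next word is seen.

```python
import math

# Toy next-word distribution P(w | context); the words and
# probabilities are illustrative assumptions, not data from the thesis.
next_word_probs = {"cat": 0.5, "dog": 0.25, "mat": 0.125, "hat": 0.125}

def surprisal(probs, word):
    # Surprisal of the observed word: -log2 P(w | context), in bits.
    return -math.log2(probs[word])

def next_word_entropy(probs):
    # Entropy of the next-word distribution: -sum_w P(w) log2 P(w), in bits.
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

print(surprisal(next_word_probs, "cat"))   # 1.0 bit
print(next_word_entropy(next_word_probs))  # 1.75 bits
```

Note the asymmetry that motivates treating them as separate predictors: surprisal can only be computed after the word is observed (integration), whereas next-word entropy is defined over the distribution alone and is therefore available before the word arrives (prediction).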
Faculty
Faculteit der Letteren