The GRU and Transformer explaining Human Behavioural Data represented by N400.

dc.contributor.advisor: Merkx, D.G.M.
dc.contributor.advisor: Slik, F.W.P. van der
dc.contributor.author: Boogaart, E. van den
dc.date.issued: 2019-06-07
dc.description.abstract: Recurrent Neural Networks (RNNs) are a popular type of neural network that is effective at processing language. The Gated Recurrent Unit (GRU) is a well-known RNN that often outperforms other RNN variants. Recently, a new neural network architecture was introduced: the Transformer. In this investigation, the GRU and the Transformer are compared in their ability to predict human sentence processing. The human language-processing data are provided by electroencephalography (EEG) measurements of brain activity. The language models compute surprisal values on a corpus of English sentences, and these surprisal values are compared to the human data obtained from the EEG experiment on the same corpus. The findings show that the GRU and the Transformer differ significantly in predicting human language-processing data: the Transformer shows higher goodness-of-fit scores for the vast majority of the training. This implies that the Transformer outperforms the GRU as a cognitive model.
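The surprisal values mentioned in the abstract are conventionally defined as the negative log probability a language model assigns to each word given its preceding context. A minimal illustrative sketch (the example words and probabilities below are hypothetical, not taken from the thesis):

```python
import math

def surprisal(prob: float) -> float:
    """Surprisal in bits: -log2 of the probability the model
    assigned to a word given its preceding context."""
    return -math.log2(prob)

# Hypothetical per-word probabilities that a trained GRU or
# Transformer might output for one sentence; in studies like this
# one, the resulting surprisal values are compared against N400
# amplitudes via goodness-of-fit measures.
probs = [("the", 0.20), ("cat", 0.05), ("sat", 0.10)]
per_word = {word: surprisal(p) for word, p in probs}
```

A word with probability 0.25 has a surprisal of exactly 2 bits; less predictable words carry higher surprisal.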
dc.identifier.uri: https://theses.ubn.ru.nl/handle/123456789/8082
dc.language.iso: en
dc.thesis.faculty: Faculteit der Letteren (Faculty of Arts)
dc.thesis.specialisation: International Business Communication
dc.thesis.studyprogramme: Bachelor Communicatie- en Informatiewetenschappen (Bachelor's in Communication and Information Sciences)
dc.thesis.type: Bachelor
dc.title: The GRU and Transformer explaining Human Behavioural Data represented by N400
Files
Original bundle (1 file)
Name: Boogaart, Ellen van den 4787455_ BA Thesis.pdf
Size: 714.81 KB
Format: Adobe Portable Document Format