The GRU and Transformer explaining Human Behavioural Data represented by N400.
dc.contributor.advisor | Merkx, D.G.M. | |
dc.contributor.advisor | Slik, F.W.P. van der | |
dc.contributor.author | Boogaart, E. van den | |
dc.date.issued | 2019-06-07 | |
dc.description.abstract | Recurrent Neural Networks (RNNs) are a popular type of neural network that is effective at processing language. The Gated Recurrent Unit (GRU) is a well-known network that often outperforms other RNNs. Recently, a new neural network architecture has been introduced: the Transformer. In this investigation, the GRU and the Transformer are compared in their ability to predict human sentence processing. The human language processing data are provided by electroencephalography (EEG) measurements of brain activity. The language models compute surprisal values on a corpus of English sentences. These surprisal values are compared to the human data obtained from the EEG experiment on the same corpus. The findings show that the GRU and the Transformer differ significantly in predicting human language processing data; the Transformer shows higher goodness-of-fit scores for the vast majority of training. This implies that the Transformer outperforms the GRU as a cognitive model. | en_US |
dc.identifier.uri | https://theses.ubn.ru.nl/handle/123456789/8082 | |
dc.language.iso | en | en_US |
dc.thesis.faculty | Faculteit der Letteren | en_US |
dc.thesis.specialisation | International Business Communication | en_US |
dc.thesis.studyprogramme | Bachelor Communicatie- en Informatiewetenschappen | en_US |
dc.thesis.type | Bachelor | en_US |
dc.title | The GRU and Transformer explaining Human Behavioural Data represented by N400. | en_US |
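The abstract above describes language models that assign surprisal values to words, which are then related to EEG (N400) data. The sketch below is purely illustrative and not taken from the thesis: it shows one common way to compute per-word surprisal, -log2 P(w_t | w_1..w_{t-1}), from a small GRU language model in PyTorch. The model size, vocabulary, and all names are hypothetical assumptions.

```python
# Illustrative sketch only: per-word surprisal from a toy GRU language model.
# All dimensions, names, and data are hypothetical, not from the thesis.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGRULM(nn.Module):
    def __init__(self, vocab_size, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.gru = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -> logits over the next word at each position
        hidden_states, _ = self.gru(self.embed(token_ids))
        return self.out(hidden_states)

def surprisal(model, token_ids):
    """Surprisal in bits for each word after the first, given its preceding context."""
    with torch.no_grad():
        logits = model(token_ids[:, :-1])              # predict each next token
        log_probs = F.log_softmax(logits, dim=-1)      # log probabilities (nats)
        targets = token_ids[:, 1:].unsqueeze(-1)
        token_logp = log_probs.gather(-1, targets).squeeze(-1)
        return -token_logp / math.log(2)               # convert nats to bits

vocab_size = 100
model = TinyGRULM(vocab_size)
sentence = torch.randint(0, vocab_size, (1, 6))        # stand-in for word indices
print(surprisal(model, sentence))                      # one surprisal value per predicted word
```

A Transformer language model would be evaluated the same way: only the module producing the next-word logits changes, while the surprisal computation and the subsequent goodness-of-fit comparison against the EEG data stay identical.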
Files
Original bundle
- Name: Boogaart, Ellen van den 4787455_ BA Thesis.pdf
- Size: 714.81 KB
- Format: Adobe Portable Document Format