Can we go faster than the world's fastest brain-computer interface: An application of recurrency to EEG2Code Deep Learning

Issue Date
2021-07-01
Language
en
Abstract
In this research, the structure of recurrent neural networks was applied to EEG2Code deep learning, a state-of-the-art high-speed electroencephalogram (EEG) brain-computer interface (BCI) speller that uses a deep convolutional neural network. The BCI predicts user intention in a noise-tagging speller set-up by decoding the EEG data and predicting the individual flashes in the noise code. While EEG2Code is presented as the current world's fastest BCI, with an information transfer rate (ITR) of 1237 bits/min and a trial classification accuracy of 95.9%, there is still room for improvement at the level of individual flash prediction. I therefore introduce an extended variant of the EEG2Code deep learning model that adds a recurrent neural network layer. The goal of this layer is to capture the temporal dependencies that occur in the data when recording user intention, in the hope of predicting individual flashes with a higher degree of certainty. The recurrent unit used was a simple recurrent neural network unit. Although this addition did not perform as expected, in that it did not improve individual flash prediction with the initial hyperparameters given, it can serve as a starting point for future researchers to tweak the hyperparameters in an attempt to truly capture the temporal dependencies on an individual flash-to-flash basis.
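To illustrate the recurrent mechanism the abstract refers to, the sketch below implements the forward pass of a simple (Elman-style) recurrent unit in NumPy. This is not the author's EEG2Code implementation; the sequence length, feature dimension, and hidden size are hypothetical placeholders chosen only to show how each hidden state depends on all earlier inputs, which is the temporal dependency the added layer is meant to capture across flashes.

```python
import numpy as np

def simple_rnn_forward(x_seq, W_x, W_h, b):
    """Forward pass of a simple recurrent unit.

    x_seq : (T, d_in) array, one feature vector per flash (hypothetical input).
    W_x   : (d_h, d_in) input-to-hidden weights.
    W_h   : (d_h, d_h) hidden-to-hidden (recurrent) weights.
    b     : (d_h,) bias.
    Returns an (T, d_h) array of hidden states; h_t = tanh(W_x x_t + W_h h_{t-1} + b),
    so each state carries information from all earlier time steps.
    """
    h = np.zeros(W_h.shape[0])
    states = []
    for x_t in x_seq:
        h = np.tanh(W_x @ x_t + W_h @ h + b)
        states.append(h)
    return np.stack(states)

# Hypothetical sizes: 8 flashes, 4 input features, 3 hidden units.
rng = np.random.default_rng(0)
T, d_in, d_h = 8, 4, 3
W_x = rng.standard_normal((d_h, d_in)) * 0.5
W_h = rng.standard_normal((d_h, d_h)) * 0.5
b = np.zeros(d_h)
x = rng.standard_normal((T, d_in))
H = simple_rnn_forward(x, W_x, W_h, b)
```

Because the hidden state is fed back through `W_h`, perturbing an early flash changes every later hidden state, unlike a purely convolutional model that scores each flash window independently.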
Faculty
Faculteit der Sociale Wetenschappen