Event-based Near-eye Gaze Tracking using a Spiking Neural Network
Issue Date
2025-01-08
Language
en
Abstract
This work introduces GazeSCRNN, a novel spiking convolutional recurrent neural network designed for event-based near-eye gaze tracking. Leveraging the high temporal resolution, energy efficiency, and compatibility of Dynamic Vision Sensor (DVS) cameras with event-based systems, GazeSCRNN uses a spiking neural network (SNN) to address the limitations of traditional gaze-tracking systems in capturing dynamic movements. The proposed model processes event streams from DVS cameras using Adaptive Leaky-Integrate-and-Fire (ALIF) neurons and a hybrid architecture optimized for spatio-temporal data. Extensive evaluations on the EV-Eye dataset demonstrate the model’s accuracy in predicting gaze vectors. In addition, ablation studies reveal the importance of the ALIF neurons, dynamic event framing, and advanced training techniques, such as Forward-Propagation-Through-Time, in enhancing overall system performance. The most accurate model achieved a Mean Angle Error (MAE) of 6.034° and a Mean Pupil Error (MPE) of 2.094 mm. Consequently, this work is pioneering in demonstrating the feasibility of using SNNs for event-based gaze tracking, while shedding light on critical challenges and opportunities for further improvement.
Index Terms—Gaze tracking, Spiking neural network, Event-based vision, Neuromorphic computing
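For readers unfamiliar with the neuron model named in the abstract, the following is a minimal sketch of a discrete-time Adaptive Leaky-Integrate-and-Fire (ALIF) update. It is illustrative only: the decay constants, adaptation strength, and soft-reset rule are common textbook choices, not the parameters or implementation used by GazeSCRNN.

```python
# Minimal ALIF neuron sketch (illustrative; not the GazeSCRNN implementation).
# The neuron leakily integrates input and fires when its membrane potential
# crosses an adaptive threshold that rises after each spike.

def alif_step(v, a, x, alpha=0.9, rho=0.95, beta=1.8, b0=1.0):
    """One discrete-time ALIF update.

    v     -- membrane potential
    a     -- adaptation variable (raises the effective threshold)
    x     -- input current at this time step
    alpha -- membrane leak factor (assumed value)
    rho   -- adaptation decay factor (assumed value)
    beta  -- adaptation strength (assumed value)
    b0    -- baseline firing threshold (assumed value)
    """
    v = alpha * v + x                    # leaky integration of input
    threshold = b0 + beta * a            # adaptive threshold
    spike = 1.0 if v >= threshold else 0.0
    v -= spike * threshold               # soft reset after a spike
    a = rho * a + (1.0 - rho) * spike    # adaptation tracks recent spiking
    return v, a, spike

def run(inputs):
    """Drive one ALIF neuron with a list of input currents; return its spike train."""
    v, a, spikes = 0.0, 0.0, []
    for x in inputs:
        v, a, s = alif_step(v, a, x)
        spikes.append(s)
    return spikes
```

With a constant positive input, the neuron fires and its rising threshold then spaces out subsequent spikes; this spike-frequency adaptation is what distinguishes ALIF from a plain LIF neuron.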
Faculty
Faculteit der Sociale Wetenschappen