Geometrically analysing synaptic delays in recurrent neural networks

Issue Date

2022-07-07

Language

en

Abstract

Temporal dynamics make recurrent neural networks highly useful for tasks such as machine translation, speech recognition, and handwriting recognition. However, recurrent neural networks rely on complicated computations, which makes the network dynamics difficult to interpret. Information transfer in these networks is also often falsely assumed to be instantaneous, ignoring the synaptic delays that occur within the brain. This creates a gap between natural and artificial neural networks, one that also persists in hardware in the form of propagation delays. This paper proposes a way to intuitively display network dynamics through the use of vector fields, and uses these geometric tools to analyse the implementation and impact of synaptic delays in recurrent neural networks. Using these methods, several patterns were found. Delays create a temporary alternate fixed point to which the system converges, before steering back to the delay-free fixed point at a rate dependent on the delay duration. Randomised delays followed a similar pattern, but displayed more erratic behaviour. These basic patterns, observed in simple networks, could be used to explain network behaviour in more complex systems.
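
The model and parameters below are not given in this record; the following is a minimal sketch of the ideas the abstract describes, assuming a two-unit continuous-time RNN of the form tau * dx/dt = -x(t) + W * tanh(x(t - d)), where d is a uniform synaptic delay. It shows one plausible way to implement the delay (a state history buffer) and to sample the delay-free vector field on a grid for plotting; all names and values are illustrative.

```python
import numpy as np

# Illustrative values only (not taken from the paper).
tau = 1.0           # time constant
dt = 0.01           # Euler integration step
delay_steps = 50    # synaptic delay of 0.5 time units
W = np.array([[0.0, -1.5],
              [1.5,  0.0]])  # example recurrent weights for a 2-unit network


def simulate(x0, n_steps):
    """Euler integration with a uniform synaptic delay on the recurrent input."""
    history = [x0.copy() for _ in range(delay_steps + 1)]  # pad with the initial state
    for _ in range(n_steps):
        x = history[-1]
        x_delayed = history[-(delay_steps + 1)]            # state d steps in the past
        dx = (-x + W @ np.tanh(x_delayed)) / tau
        history.append(x + dt * dx)
    return np.array(history)


def vector_field(lim=2.0, n=20):
    """Delay-free flow dx/dt sampled on a grid, ready for a quiver/stream plot."""
    xs = np.linspace(-lim, lim, n)
    X1, X2 = np.meshgrid(xs, xs)
    states = np.stack([X1.ravel(), X2.ravel()])            # shape (2, n*n)
    dX = (-states + W @ np.tanh(states)) / tau
    return X1, X2, dX[0].reshape(n, n), dX[1].reshape(n, n)


trajectory = simulate(np.array([1.0, -0.5]), n_steps=2000)
X1, X2, U, V = vector_field()
# e.g. with matplotlib: plt.quiver(X1, X2, U, V); plt.plot(trajectory[:, 0], trajectory[:, 1])
```

Overlaying the simulated delayed trajectory on the delay-free vector field is one way to visualise the pattern the abstract reports: the delayed system heading towards a temporary alternate fixed point before returning to the delay-free one.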

Faculty

Faculteit der Sociale Wetenschappen