Parametric Properties in Chaotic Neural Networks

Issue Date

2016-08-25

Language

en

Abstract

In the framework of reservoir computing, a randomly constructed recurrent neural network (RNN) functions as a reservoir containing the input history and many random transformations of it. Output units transform the contents of this reservoir into a meaningful output. However, it is difficult to construct a random reservoir such that the complete model is able to learn and perform a given task. A model's performance depends on two things: the parameters of the RNN and the complexity of the task. The network parameters need to be fine-tuned to the task at hand to achieve the best results. First, we propose and investigate several measures for quantifying the difficulty of the chosen to-be-trained pattern. The patterns used are visual. We then investigate RNN performance after FORCE learning on tasks of different complexities, selected with the apparently most appropriate of these measures, and we investigate how an RNN's parameters determine its eventual capability to learn. Under investigation are reservoir size, connectivity, and weight scaling. All findings were obtained through simulation studies, with additional theoretical explanations and references to relevant literature.
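The reservoir setup described above can be sketched in a few lines. This is a minimal illustrative example, not the thesis's actual model: the parameter names (N, connectivity, spectral_radius), the tanh units, and the chosen values are assumptions made here to show how reservoir size, connectivity, and weight scaling enter the construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(N=100, connectivity=0.1, spectral_radius=0.9):
    """Build a sparse random recurrent weight matrix and rescale it so
    that its largest absolute eigenvalue equals spectral_radius (the
    'weight scaling' knob)."""
    W = rng.standard_normal((N, N))
    mask = rng.random((N, N)) < connectivity   # keep ~10% of connections
    W = W * mask
    rho = max(abs(np.linalg.eigvals(W)))       # current spectral radius
    return W * (spectral_radius / rho)

def run_reservoir(W, inputs, w_in):
    """Drive the reservoir with a 1-D input sequence and collect the
    state at every step; a trained readout (e.g. via FORCE learning)
    would map these states to the target output."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        x = np.tanh(W @ x + w_in * u)          # simple tanh update
        states.append(x.copy())
    return np.array(states)

W = make_reservoir()
w_in = rng.standard_normal(100)
states = run_reservoir(W, np.sin(np.linspace(0.0, 6.0, 50)), w_in)
print(states.shape)  # one state vector per input step
```

Rescaling by the spectral radius is one common way to control the reservoir's dynamical regime; the thesis investigates how such scaling choices, together with size and connectivity, affect learnability.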

Faculty

Faculteit der Sociale Wetenschappen