Parametric Properties in Chaotic Neural Networks

dc.contributor.advisorGerven, M.A.J. van
dc.contributor.advisorQuax, S.C.
dc.contributor.authorVliex, L.C.H.
dc.date.issued2016-08-25
dc.description.abstractIn the framework of reservoir computing, a randomly constructed recurrent neural network (RNN) functions as a reservoir containing the input history and many random transformations of it. Output units transform the contents of this reservoir into a meaningful output. However, it is difficult to construct a random reservoir such that the complete model is able to learn and perform a given task. A model's performance depends on two things: the parameters of the RNN and the task's complexity. The network parameters need to be fine-tuned to the task at hand to achieve the best results. First we propose and investigate several measures for quantifying the difficulty of the chosen to-be-trained pattern. The patterns used are visual. We then investigate RNN performance after FORCE learning on tasks of different complexities, selected with the apparently most appropriate of these measures, and we investigate how an RNN's parameters determine its eventual capability to learn. Under investigation are reservoir size, connectivity, and weight scaling. All findings were obtained through simulation studies, with additional theoretical explanations and references to relevant literature.en_US
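The reservoir-computing setup the abstract describes (a random, sparsely connected RNN whose recurrent weights are scaled, driven by an input, with only the output weights trained) can be sketched as a minimal echo state network with a ridge-regression readout. All names and parameter values below are illustrative assumptions, not taken from the thesis, and this uses batch ridge regression rather than the online FORCE rule the thesis studies:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir hyperparameters (illustrative values, not from the thesis)
n_res = 200           # reservoir size
sparsity = 0.1        # connectivity: fraction of nonzero recurrent weights
spectral_scale = 0.9  # weight scaling: spectral radius of recurrent matrix

# Randomly constructed recurrent weight matrix, sparsified and rescaled
W = rng.standard_normal((n_res, n_res))
W *= rng.random((n_res, n_res)) < sparsity
W *= spectral_scale / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.standard_normal(n_res)

# Drive the reservoir with a scalar input signal and collect its states,
# i.e. the "input history and its many random transformations"
T = 500
u = np.sin(np.arange(T) * 0.1)
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Only the linear readout is trained (here: ridge regression) to map
# reservoir states to a target, e.g. one-step-ahead prediction of the input
target = np.roll(u, -1)
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ target)
pred = states @ W_out
```

Varying `n_res`, `sparsity`, and `spectral_scale` in this sketch corresponds to the three parameters under investigation; the spectral radius in particular controls how close the reservoir dynamics sit to the chaotic regime.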
dc.identifier.urihttp://hdl.handle.net/123456789/2622
dc.language.isoenen_US
dc.thesis.facultyFaculteit der Sociale Wetenschappenen_US
dc.thesis.specialisationBachelor Artificial Intelligenceen_US
dc.thesis.studyprogrammeArtificial Intelligenceen_US
dc.thesis.typeBacheloren_US
dc.titleParametric Properties in Chaotic Neural Networksen_US
Files
Original bundle
Name: Vliex, L._BSc-Thesis_2016.pdf
Size: 1.78 MB
Format: Adobe Portable Document Format
Description: Thesis text