Using Negative Expected Divergence as an Alternative Cost Function in the Psi-marginal Adaptive Method
Issue Date
2018-06-18
Language
en
Abstract
Finding the most informative stimuli for estimating model parameters within a short timespan is an important problem in psychometric experiments. The psi-marginal adaptive method is a Bayesian adaptive estimation algorithm that estimates these parameters by presenting informative stimuli. However, this algorithm still performs poorly in experiments where multiple free parameters have to be estimated from multidimensional stimuli.
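To make the stimulus-selection step concrete, below is a minimal sketch of the entropy-based psi-method in Python. It assumes a two-alternative forced choice logistic psychometric function over a two-parameter grid (threshold and slope); the grids, function names, and the fixed 0.5 guess rate are illustrative assumptions, not taken from the thesis. The full psi-marginal method would additionally marginalise each candidate posterior over nuisance parameters before computing entropy.

import numpy as np

# Illustrative grids (not from the thesis): threshold alpha, slope beta,
# and candidate intensities for a one-dimensional stimulus.
alphas = np.linspace(-2.0, 2.0, 41)
betas = np.linspace(0.5, 5.0, 21)
stimuli = np.linspace(-3.0, 3.0, 61)

A, B = np.meshgrid(alphas, betas, indexing="ij")
prior = np.full(A.shape, 1.0 / A.size)   # flat prior over (alpha, beta)

def p_correct(x):
    # 2AFC logistic psychometric function with a 0.5 guess rate,
    # evaluated at stimulus x for every (alpha, beta) grid point.
    return 0.5 + 0.5 / (1.0 + np.exp(-B * (x - A)))

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def next_stimulus(prior):
    # Choose the stimulus that minimises the response-averaged
    # entropy of the posterior (the classic psi-method cost).
    best_x, best_h = None, np.inf
    for x in stimuli:
        pc = p_correct(x)
        m1 = np.sum(prior * pc)            # predictive P(correct)
        m0 = 1.0 - m1                      # predictive P(incorrect)
        post1 = prior * pc / m1            # posterior after a correct response
        post0 = prior * (1.0 - pc) / m0    # posterior after an incorrect one
        h = m1 * entropy(post1) + m0 * entropy(post0)
        if h < best_h:
            best_x, best_h = x, h
    return best_x

def update(prior, x, correct):
    # Bayes update of the parameter grid after observing the response.
    like = p_correct(x) if correct else 1.0 - p_correct(x)
    post = prior * like
    return post / post.sum()

A simulated session then alternates the two functions: present x = next_stimulus(prior), observe the response, and set prior = update(prior, x, correct) until the posterior is sufficiently concentrated.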
In this thesis I investigate the usability of a novel cost function based on uncertainty sampling, called negative expected divergence (NED), as an alternative to the commonly used entropy cost function in the psi-method, and I test whether NED improves the performance of the algorithm in a simple two-alternative forced choice experiment. My focus is on using these cost functions in a practical setting where stimuli have to be computed in real time for human subjects.
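Since the abstract does not spell out the NED formula, the sketch below is only one plausible reading of the name, not the thesis's exact definition: it scores each candidate stimulus by the negated expected Kullback-Leibler divergence between the candidate posteriors and the current prior, so that minimising the cost maximises the expected belief change. It reuses p_correct, prior, and stimuli from the previous sketch; the function names are again illustrative.

def kl(p, q):
    # KL divergence between two distributions on the same grid.
    m = p > 0
    return np.sum(p[m] * (np.log(p[m]) - np.log(q[m])))

def ned_cost(prior, x):
    # Illustrative stand-in for NED: negated expected divergence of the
    # candidate posteriors from the prior, averaged over the predicted
    # responses.
    pc = p_correct(x)
    m1 = np.sum(prior * pc)
    m0 = 1.0 - m1
    post1 = prior * pc / m1
    post0 = prior * (1.0 - pc) / m0
    return -(m1 * kl(post1, prior) + m0 * kl(post0, prior))

def next_stimulus_ned(prior):
    # Lowest cost = largest expected divergence from the prior.
    return min(stimuli, key=lambda x: ned_cost(prior, x))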
NED converges faster, but entropy outperforms it in computational time and reliability; the two cost functions achieve similar accuracy. In a practical setting, the time between presented stimuli and the reliability of the results of a single experiment are important, so entropy is the better cost function for similar experiments. However, there are alterations that could be made to my implementation of NED to improve its performance, and the two cost functions have not yet been tested against each other in higher-dimensional experiments.
Faculty
Faculteit der Sociale Wetenschappen