Encoding Spatial Information through Sensory Substitution

Issue Date

2024-07-08

Language

en

Abstract

Somatosensory feedback is critically absent from current consumer motor prostheses, leading to poor movement dexterity and an increased burden on the remaining senses. While the restoration of tactile feedback has produced satisfactory results, the restoration of proprioceptive feedback has lagged behind. In this thesis, I investigated the applicability of non-invasive sensory substitution for encoding spatial variables that can guide movement, as a proof of principle for encoding proprioceptive variables for prostheses. An auditory-spatial map was used to encode the one-dimensional position of a computer cursor as the frequency of an audio tone. In the first experiment, the learnability of the auditory-spatial map was tested with a visual target-matching task in which cursor position was communicated via auditory feedback. Accuracy and precision improved after training, suggesting that participants can learn the auditory-spatial map. In the second experiment, the auditory-spatial map was evaluated by testing for sensory integration between auditory and visual cues: spatial conflicts were introduced between auditory and visual information about cursor position in a target-matching task. The visual cue was presented at varying offsets and noise levels to derive the relative weighting that participants assigned to the two sources of information. The data suggest that auditory and visual cues were integrated, although manipulating visual cue noise did not affect behavior. Computational modeling further showed that models of multisensory integration with a single visual-noise parameter explained the data better than unisensory models. Together, these results demonstrate the learnability and multisensory integration of a novel auditory-spatial map, presenting a promising proof of principle for using sensory substitution to encode proprioceptive variables in prostheses.
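
The following is a minimal sketch of the two technical ideas in the abstract, under assumptions not stated there: the auditory-spatial map is illustrated as a linear mapping from normalized cursor position onto an assumed 200-2000 Hz tone range, and the relative cue weighting is illustrated with the standard maximum-likelihood (inverse-variance) combination rule. The constants F_MIN and F_MAX and the function names position_to_frequency and integrate_cues are hypothetical and not taken from the thesis.

    import numpy as np

    # Auditory-spatial map: one-dimensional cursor position encoded as tone
    # frequency. The frequency range and the linear form are assumptions made
    # for illustration only.
    F_MIN, F_MAX = 200.0, 2000.0  # assumed tone-frequency range in Hz

    def position_to_frequency(x):
        """Map a normalized cursor position x in [0, 1] to a tone frequency in Hz."""
        x = float(np.clip(x, 0.0, 1.0))
        return F_MIN + x * (F_MAX - F_MIN)

    # Reliability-weighted (maximum-likelihood) combination of an auditory and
    # a visual position estimate: the lower-variance cue receives the larger
    # weight. This is the textbook integration rule, shown as one common way
    # such relative weighting is modeled, not the specific model fitted in the
    # thesis.
    def integrate_cues(x_aud, sigma_aud, x_vis, sigma_vis):
        w_aud = sigma_vis**2 / (sigma_aud**2 + sigma_vis**2)
        x_hat = w_aud * x_aud + (1.0 - w_aud) * x_vis
        sigma_hat = np.sqrt(sigma_aud**2 * sigma_vis**2 / (sigma_aud**2 + sigma_vis**2))
        return x_hat, sigma_hat

    if __name__ == "__main__":
        print(position_to_frequency(0.25))            # tone for a cursor at 25% of the range
        print(integrate_cues(0.30, 0.05, 0.40, 0.10)) # auditory cue weighted more heavily

In this formulation the more reliable cue receives the larger weight, which is how the relative weighting of auditory and visual information described above is typically derived.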

Faculty

Faculteit der Sociale Wetenschappen