Searching Near and Far: Investigating Depth-dependent Adaptation of Search Template Size in Naturalistic Visual Search

Current theories of visual search assume that we create a template representing the target object by pre-activating neurons tuned to target features. When searching in naturalistic scenes, however, the visual features of the target may change drastically depending on its location in the scene; for example, its retinal size depends on its distance. To account for this, the template may be rescaled based on depth. In a first experiment, we used breaking continuous flash suppression (b-CFS) to probe the template formed in a search task that required participants to take depth-dependent size changes into account, testing whether size-matching probes were detected faster. Suppression times for probes, however, were generally not modulated as a function of their match with target features. Using fMRI and multivariate pattern analysis (MVPA), we then investigated the neural basis of the search template, testing whether the expected retinal size of objects participants prepared to search for could be decoded from the lateral occipital complex (LOC), and whether depth information from scene-selective areas modulated template size. In line with our hypotheses, we found overlapping voxel activation patterns within LOC for seeing objects of varying retinal size and for preparing to search for these objects near or far. This effect was, however, not specific to the search task. While distance information based on low-level features may have contributed to size or depth processing in LOC, we found no evidence for a contribution of depth information from scene-selective areas. Further research is needed to understand which specific mechanisms our findings in LOC reflect, but these mechanisms likely contribute to our ability to account for changes in visual features during search.
Faculteit der Sociale Wetenschappen