Visual and Tactile Search
Spatially uninformative auditory cues and visual search
Spatially informative auditory cues have long been used to guide attention in visual search. A recent study by van der Burg et al. (2008) showed that spatially uninformative sounds can also enhance visual search when the sounds are synchronized with color changes of the visual target, a phenomenon referred to as the "pip-and-pop" effect. In this project, we are interested in how this "pip-and-pop" effect relates to changes in oculomotor scanning behavior induced by the sounds. In our recent study (Zou, Müller, & Shi, 2012), we showed that sound events increase fixation durations upon their occurrence and decrease the mean number of saccades, suggesting that non-spatial sounds cause a general freezing effect on oculomotor scanning behavior, which in turn benefits visual search performance through temporally and spatially extended information sampling.
Zou, H., Müller, H. J., & Shi, Z. (2012). Journal of Vision, 12(5), 2. doi:10.1167/12.5.2
Contextual cueing effect and its application
Invariant spatial context can speed up responses in visual search, a benefit that has been termed the contextual cueing effect. On most mobile devices, application icons are arranged in a relatively fixed configuration. When the holding orientation of the device changes, however, all icons are shuffled and remapped to new locations that preserve only their linear position order. This may disrupt the learned spatial configuration, which in turn impedes search performance. In this project, we aim to compare search performance, in terms of the contextual cueing effect, under different spatial remapping methods.
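The "linear position order" remapping described above can be sketched as follows. This is only an illustrative toy model: the grid sizes, icon counts, and function names are assumptions for the example, not the actual interface or stimuli used in the project.

```python
# Sketch of icon remapping when a grid home screen is rotated: each icon
# keeps its linear (reading-order) index, so its (row, col) position on
# the screen generally changes. Grid sizes here are hypothetical.

def grid_positions(n_icons, n_cols):
    """Return (row, col) for each icon index, filling rows left to right."""
    return [(i // n_cols, i % n_cols) for i in range(n_icons)]

def linear_remap(n_icons, old_cols, new_cols):
    """Pair each icon's old and new grid position under linear-order remapping."""
    old = grid_positions(n_icons, old_cols)
    new = grid_positions(n_icons, new_cols)
    return list(zip(old, new))

# Example: 20 icons, portrait (4 columns) rotated to landscape (5 columns).
moves = linear_remap(20, old_cols=4, new_cols=5)
displaced = sum(1 for old, new in moves if old != new)
print(f"{displaced} of {len(moves)} icons change grid position")
# -> 16 of 20 icons change grid position
```

In this toy example only the first row survives the rotation unchanged; most icons land in a different row and column, which illustrates why linear-order remapping could disrupt a learned spatial configuration.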
Contextual cueing and tactile search
In this project we investigate how contextual learning influences tactile search and its underlying spatial reference frames. Previous studies on contextual learning and search have been conducted exclusively in the visual domain. Here we set up a vibrotactile search interface, analogous to a visual search display, presenting vibrotactile stimuli to participants' fingers. Analogous to the visual contextual cueing paradigm, we compared search performance between repeated and non-repeated tactile search arrays, and found that reaction times were faster for the repeated arrays, even though those arrays could not be explicitly recognized (Assumpção et al., 2015). In several ongoing studies we are now examining how tactile spatial information is encoded in the brain.
- Assumpção, L., Shi, Z., Zang, X., Müller, H. J., & Geyer, T. (2015). Contextual cueing: implicit memory of tactile context facilitates tactile search. Attention, Perception, & Psychophysics. doi:10.3758/s13414-015-0848-y
- Shi, Z., Zang, X., Jia, L., Geyer, T., & Müller, H. J. (2013). Transfer of contextual cueing in full-icon display remapping. Journal of Vision, 13(3), 2, 1–10. doi:10.1167/13.3.2
- Zang, X., Jia, L., Müller, H. J., & Shi, Z. (2014). Invariant spatial context is learned but not retrieved in gaze-contingent tunnel-view search. Journal of Experimental Psychology: Learning, Memory, and Cognition. doi:10.1037/xlm0000060
- Zou, H., Müller, H. J., & Shi, Z. (2012). Non-spatial sounds regulate eye movements and enhance visual search. Journal of Vision, 12(5), 1–18. doi:10.1167/12.5.2