Klemens Knöferle has published a peer-reviewed article with several co-authors in the Journal of Experimental Psychology: Applied.
Building on models of crossmodal attention, the present research proposes that brand search is inherently multisensory, in that consumers’ visual search for a specific brand can be facilitated by semantically related stimuli presented in another sensory modality. A series of five experiments demonstrates that the presentation of spatially non-predictive auditory stimuli associated with products (e.g., usage sounds or product-related jingles) can crossmodally facilitate consumers’ visual search for, and selection of, products. Eye-tracking data (Experiment 2) revealed that the crossmodal effect of auditory cues on visual search manifested itself not only in reaction times, but also in the earliest stages of visual attentional processing, suggesting that the semantic information embedded within sounds can modulate the perceptual saliency of the target products’ visual representations. Crossmodal facilitation was even observed for newly learnt associations between unfamiliar brands and sonic logos, implicating multisensory short-term learning in establishing audiovisual semantic associations. The facilitation effect was stronger when searching complex rather than simple visual displays, suggesting a modulatory role of perceptual load.
Knöferle, Klemens M., et al. 2016. “Multisensory Brand Search: How the Meaning of Sounds Guides Consumers’ Visual Attention.” Journal of Experimental Psychology: Applied 22(2): 196–210.