Multisensory Brand Search: How the Meaning of Sounds Guides Consumers’ Visual Attention
Journal article, Peer reviewed
Date
2016
Original version
Journal of Experimental Psychology: Applied, 22(2), 196-210. http://dx.doi.org/10.1037/xap0000084
Abstract
Building on models of crossmodal attention, the present research proposes that brand search is inherently multisensory, in that consumers’ visual search for a specific brand can be facilitated by semantically related stimuli presented in another sensory modality. A series of five experiments demonstrates that the presentation of spatially non-predictive auditory stimuli associated with products (e.g., usage sounds or product-related jingles) can crossmodally facilitate consumers’ visual search for, and selection of, products. Eye-tracking data (Experiment 2) revealed that the crossmodal effect of auditory cues on visual search manifested itself not only in reaction times but also in the earliest stages of visual attentional processing, suggesting that the semantic information embedded within sounds can modulate the perceptual saliency of the target products’ visual representations. Crossmodal facilitation was observed even for newly learnt associations between unfamiliar brands and sonic logos, implicating multisensory short-term learning in establishing audiovisual semantic associations. The facilitation effect was stronger when searching complex rather than simple visual displays, suggesting a modulatory role of perceptual load.
Description
This article may not exactly replicate the authoritative document published in the APA journal. It is not the copy of record.