This work is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).
The nature of interactions between the senses is a topic of intense interest in neuroscience, but an unresolved question is how sensory information from hearing and vision is combined when the two senses interact. A problem for testing auditory-visual interactions is devising stimuli and tasks that are equivalent in both modalities. Here we report a novel paradigm in which we first equated the discriminability of the stimuli in each modality, then tested how a distractor in the other modality affected performance. Participants discriminated pairs of amplitude-modulated tones or size-modulated visual objects in the form of a cuboid, alone or when a similarly modulated distractor stimulus of the other modality occurred with one of the pair. Discrimination of sound modulation depth was affected by a modulated cuboid only when their modulation rates were the same. In contrast, discrimination of cuboid modulation depth was little affected by an equivalently modulated sound. Our results suggest that what observers perceive when auditory and visual signals interact is not simply determined by the discriminability of the individual sensory inputs, but also by factors that increase the perceptual binding of these inputs, such as temporal synchrony.
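To make the stimulus construction concrete, the sketch below generates a sinusoidally amplitude-modulated tone of the kind described in the abstract, with modulation rate and modulation depth as free parameters; the same low-frequency envelope can equally drive the frame-by-frame size of a visual object such as a cuboid. All parameter values here (carrier frequency, sample rate, modulation rate and depth) are illustrative assumptions, not the values used in the study.

    import numpy as np

    def am_tone(duration=1.0, fs=44100, carrier_hz=500.0,
                mod_rate_hz=4.0, mod_depth=0.5):
        """Sinusoidally amplitude-modulated tone.

        mod_depth (0..1) sets how deeply the carrier's envelope is
        modulated at mod_rate_hz. Parameter values are illustrative,
        not those reported in the paper.
        """
        t = np.arange(int(duration * fs)) / fs
        envelope = 1.0 + mod_depth * np.sin(2.0 * np.pi * mod_rate_hz * t)
        carrier = np.sin(2.0 * np.pi * carrier_hz * t)
        # Normalise so the peak amplitude stays within [-1, 1]
        return envelope * carrier / (1.0 + mod_depth)

    # A depth-discrimination trial compares two tones that differ only
    # in modulation depth (hypothetical values for illustration):
    standard = am_tone(mod_depth=0.5)
    comparison = am_tone(mod_depth=0.6)

    # Visual analogue: the same kind of envelope, sampled at the display
    # frame rate, scales the size of a cuboid instead of a tone's amplitude.
    frame_t = np.arange(0.0, 1.0, 1.0 / 60.0)
    size_envelope = 1.0 + 0.5 * np.sin(2.0 * np.pi * 4.0 * frame_t)

In this framing, equating discriminability across modalities amounts to choosing depth differences for the tone and the cuboid that are equally easy to detect, and "same modulation rate" means the auditory and visual envelopes share mod_rate_hz and are therefore temporally synchronous.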
Author(s): Vuong QC, Laing M, Prabhu A, Tung HI, Rees A
Publication type: Article
Publication status: Published
Journal: Scientific Reports
Year: 2019
Volume: 9
Online publication date: 20/05/2019
Acceptance date: 08/05/2019
Date deposited: 13/05/2019
ISSN (electronic): 2045-2322
Publisher: Nature Publishing Group
URL: https://doi.org/10.1038/s41598-019-44079-5
DOI: 10.1038/s41598-019-44079-5