How do we integrate modality-specific perceptual information arising from the same physical event into a coherent percept? One possibility is that observers rely on information across perceptual modalities that shares temporal structure and/or semantic associations. To explore the contributions of these two factors to multisensory integration, we manipulated the temporal and semantic relationships between auditory and visual information produced by real-world events, such as paper tearing or

arXiv:1606.05004v1