Investigating the effects of audio-visual spatial congruence on multisensory integration.
We live in a complex world in which we are bombarded by many different sensory inputs. Somehow our brain is able to take all of these complex and overlapping sensory inputs and impose order on our surroundings, constructing our perceptual world and the various objects in it. Combining information across modalities is behaviourally advantageous: it has been shown to improve visual search times, speed reaction times, and improve accuracy. For the purposes of this body of work, we focus on auditory and visual integration, how each modality may influence perception, and how the peculiarities of audio-visual integration might be leveraged for beneficial applications in Virtual Reality. Three studies are reported in this thesis, and each provides a unique contribution to our understanding of multisensory integration. Specifically, these studies examine how the spatial relationships between auditory and visual stimuli affect multisensory integration.