In perception, information reaching the central nervous system is processed in specialized cortical areas. For example, visual (V) mouth movements during speech are first processed in the primary visual cortex, and auditory (A) sound waves representing syllables and words are processed in the primary auditory cortex. In everyday situations, this unisensory information is often noisy and fragmented. Thus, following initial perceptual processing, we often need to integrate this fragmented auditory (A) and visual (V) information into a subjectively coherent mental image (I), in order to gain a comprehensive understanding of our multisensory environment.
Keil, J., & Senkowski, D. (2018). Neural Oscillations Orchestrate Multisensory Processing. The Neuroscientist, 24(6), 609–626. https://doi.org/10.1177/1073858418755352
In an everyday interaction, we both see and hear our partner. This visual (V) and auditory (A) information is processed separately at first and integrated (I) later.