
Multisensory integration of metaphorically related audiovisual inputs in visual cortex

This repository contains figures, demos, and analysis code for the crossmodal project (presented at SfN 2019; manuscript in preparation).
The abstract appears below the figures.

Fig 1. Experimental design and psychophysics results


Fig 2. Video demo of the congruent and incongruent trials


Fig 3. Definition of central and peripheral visual ROIs


Fig 4. Motion direction (upward/downward) classification results for the congruent, incongruent, and ambiguous conditions


Fig 5. Melodic contour (ascending/descending) classification results for the auditory-only condition


Fig 6. Searchlight results (medial/lateral)


Fig 7. Searchlight results (dorsal/ventral)


Multisensory integration of metaphorically related audiovisual inputs in visual cortex

Although primary sensory cortices have traditionally been considered unimodal in function, recent work has shown that multisensory processing also occurs in early modality-specific areas. In particular, neuroanatomical studies in nonhuman primates and human neuroimaging studies have found that the anterior portion of the primary visual cortex (V1), retinotopically mapped to the peripheral visual field, receives extensive feedback signals from primary auditory cortex (A1), which could support multisensory integration at early stages of cortical processing. However, it remains unclear how unimodal sensory information in early sensory cortex is affected by input from other sensory modalities when information from different senses is metaphorically associated without spatiotemporal correspondence. Using fMRI and multi-voxel pattern classification, we examined whether visual motion information in V1 is modulated when a moving stimulus is presented with a melodic contour that is congruent (e.g., an ascending melody with upward motion, or a descending melody with downward motion) or incongruent with the direction of visual motion. While listening to an ascending, descending, or ambiguous melody (auditory), participants viewed random dots moving upward or downward within a circular annulus (visual) and monitored them for occasional changes in motion direction. Shepard tones were used to create the different melodic contours, which were perceived as ascending or descending endlessly in pitch. The results showed that the direction of visually presented motion was successfully decoded in V1 for both the congruent and incongruent conditions. More interestingly, decoding accuracy was significantly higher in the regions of V1 corresponding to the peripheral visual field than in the regions mapped to the central visual field, but only when the directions of visual motion and melodic contour were congruent. Our results suggest that high-level auditory information that has an abstract association with visual properties can modulate visual representations in visual cortex via cortical feedback, facilitating multisensory integration in an abstract space.
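
The abstract names Shepard tones as the basis for the melodic contours. As a rough illustration of that technique (not the repository's actual stimulus code), the sketch below builds a Shepard tone from octave-spaced sinusoids under a fixed Gaussian amplitude envelope over log frequency; the sample rate, envelope width, base frequencies, and function name are all illustrative assumptions.

```python
import numpy as np

def shepard_tone(base_freq, dur=0.5, sr=44100, n_octaves=8, center=523.25, sigma=1.0):
    """One Shepard tone: sinusoids at base_freq * 2**k for k octaves, each
    weighted by a Gaussian envelope over log2 frequency (centered here on an
    assumed C5), so no single component dominates as the pitch shifts."""
    t = np.arange(int(dur * sr)) / sr
    tone = np.zeros_like(t)
    for k in range(n_octaves):
        f = base_freq * 2.0 ** k
        weight = np.exp(-0.5 * ((np.log2(f) - np.log2(center)) / sigma) ** 2)
        tone += weight * np.sin(2 * np.pi * f * t)
    return tone / np.max(np.abs(tone))  # normalize to [-1, 1]

# An "ascending" contour: step the base frequency up in semitones. Because the
# amplitude envelope stays fixed, the sequence can loop indefinitely and still
# sound as if it keeps rising -- unambiguous in direction, ambiguous in
# absolute pitch.
ascending = np.concatenate([shepard_tone(32.70 * 2 ** (s / 12)) for s in range(12)])
```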
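
For the classification analyses, below is a minimal sketch of run-wise cross-validated decoding, assuming trial-by-voxel patterns (e.g., GLM betas) have already been extracted for each ROI. The array names, dimensions, linear SVM choice, and leave-one-run-out scheme are assumptions for illustration, not the project's actual pipeline.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_runs, trials_per_run = 8, 16
n_trials = n_runs * trials_per_run

y = rng.integers(0, 2, size=n_trials)                # 0 = downward, 1 = upward motion
runs = np.repeat(np.arange(n_runs), trials_per_run)  # run label per trial

# Placeholder patterns for two hypothetical retinotopic ROIs; a real analysis
# would load trial-wise activity patterns restricted to each V1 subregion.
patterns = {
    "central_V1": rng.standard_normal((n_trials, 150)),
    "peripheral_V1": rng.standard_normal((n_trials, 150)),
}

# Decode motion direction with a linear SVM, leaving one run out per fold,
# then compare accuracies between the central and peripheral ROIs.
for roi, X in patterns.items():
    clf = LinearSVC(C=1.0, max_iter=10000)
    acc = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
    print(f"{roi}: mean decoding accuracy = {acc.mean():.3f}")
```

With random placeholder data these accuracies hover around chance (0.5); the contrast of interest in the abstract is the peripheral-minus-central accuracy difference under the congruent condition.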
