Shoko Kanaya

Main Lab Location:

CiNet (Main bldg.)

Specific Research Topic:

Visual Perception, Multi-sensory Processing, Psychophysics, fMRI

Mailing Address:

1-4 Yamadaoka, Suita, Osaka 565-0871 Japan


kanaya at

In our daily lives, we receive sensory inputs from several sensory modalities, such as vision, audition, and touch. Signals from these different modalities are processed in distinct brain areas. Even within a single sensory modality (e.g. vision), distinct sensory features such as color and orientation are processed separately. However, we normally perceive single unified objects or events, not fragmented pieces of sensory information.

I am interested in how the human brain puts this segregated information together and reconstructs a coherent visual world. My previous research has utilised psychophysical techniques to study the mechanisms underlying multi-sensory integration and the extraction of summary statistics in vision. Currently, I am investigating how depth information derived from binocular disparities is stably represented over the whole visual field, combining psychophysical and brain-imaging techniques.

Selected Publications:

Kanaya, S., Hayashi, M.J., Whitney, D. (2018). Exaggerated groups: amplification in ensemble coding of temporal and spatial features. Proceedings of the Royal Society B, 285: 20172770.

Kanaya, S., Kariya, K., & Fujisaki, W. (2016). Cross-modal correspondence between vision, audition, and touch in natural objects: an investigation of the perceptual properties of wood. Perception, 45(10), 1099-1114.

Kanaya, S., Fujisaki, W., Nishida, S., Furukawa, S., & Yokosawa, K. (2015). Effects of frequency separation and diotic/dichotic presentations on the alternation frequency limits in audition derived from a temporal phase discrimination task. Perception, 44(2), 198-214.

Kanaya, S., Matsushima, Y., & Yokosawa, K. (2012). Does seeing ice really feel cold? Visual-thermal interaction under an illusory body-ownership. PLoS ONE, 7(11): e47293.

Kanaya, S., & Yokosawa, K. (2011). Perceptual congruency of audio-visual speech affects ventriloquism with bilateral visual stimuli. Psychonomic Bulletin & Review, 18(1), 123-128.