Main Lab Location:
CiNet (Main bldg.)
1-4 Yamadaoka, Suita, Osaka 565-0871
Specific Research Topic:
Integration of multimodal information
I study the neural mechanisms underlying multimodal integration.
In daily life, we are exposed to vast amounts of information through several sensory modalities, including vision, audition, and touch. Multimodal information sometimes originates from a single source and sometimes from different sources. Integrating related multimodal information is crucial for our cognition of the outside world, yet the underlying neural mechanisms remain poorly understood. We use manipulative techniques, including neurofeedback, to identify the neural activities that are causally related to multisensory integration.
Yuasa, K., Yotsumoto, Y. (2015): Opposite Distortions in Interval Timing Perception for Visual and Auditory Stimuli with Temporal Modulations. PLoS One 10(8), e0135646.