Juan Liu

Main Lab Location:
NICT (Keihanna)
Specific Research Topic:
Multisensory perception and action, human-machine interfaces
Mailing Address:
3-5 Hikaridai, Seika-cho, Soraku-gun, Kyoto, 619-0289, Japan

My research focuses on the mechanisms of multisensory integration across vision, proprioception, touch, and audition, and on how the brain combines past experience and prior knowledge with present sensory inputs within the perception-action loop.
The ability to flexibly and adaptively integrate information from a variety of sources is a fundamental function of the human brain, and I believe this process is tailored to both internal representations and the external world.
Using human psychophysics together with virtual reality and robotic systems, I aim to identify behavioral strategies and key features of multisensory processing that are amenable to computational modeling.
Building on this understanding of human sensorimotor processing, my goal is twofold: to use multisensory information to effectively shape the formation of internal representations for training and skill acquisition, and to devise multisensory interfaces that enable intuitive, smooth human-machine interaction, such as remote operation.

Selected Publications:

Liu, J. & Ando, H. (2018) Response modality vs. target modality: sensory transformations and comparisons in cross-modal slant matching tasks. Scientific Reports 8, Article number: 11068. doi: 10.1038/s41598-018-29375-w.

Liu, J. & Ando, H. (2016) Metal Sounds Stiffer than Drums for Ears, but Not Always for Hands: Low-Level Auditory Features Affect Multisensory Stiffness Perception More than High-Level Categorical Information. PLoS ONE 11: e0167023. doi: 10.1371/journal.pone.0167023.