{"id":2034,"date":"2018-02-06T18:55:00","date_gmt":"2018-02-06T09:55:00","guid":{"rendered":"http:\/\/cinetjp-static3.nict.go.jp\/english\/?post_type=event&p=2034"},"modified":"2022-10-04T19:06:06","modified_gmt":"2022-10-04T10:06:06","slug":"20180206_2344","status":"publish","type":"event","link":"http:\/\/cinetjp-static3.nict.go.jp\/english\/event\/20180206_2344\/","title":{"rendered":"Gouki Okazawa: \u201cModeling integration of dynamic multi-dimensional sensory evidence for perceptual decision-making\u201d"},"content":{"rendered":"\n

March 12, 2018  12:15–13:25

CiNet 1F Conference Room

Gouki Okazawa

Center for Neural Science, New York University, USA

Host: Hiromasa Takemura (Amano group)

Abstract:

Perceptual decision-making is a process of commitment to a plan of action based on sensory evidence gathered from the outside world.
Sensory evidence is multi-dimensional and changes dynamically over time; visual signals, for example, consist of spatiotemporal patterns of information, and the decision-making process converts these patterns into an action. In this talk, I will discuss how one can gain insight into this conversion mechanism from behavioral data using psychophysical reverse correlation. First, through computational modeling, I show that psychophysical reverse correlation reflects the complexity of both the sensory and decision processes, and that one needs a detailed, quantitative model to draw valid conclusions from the reverse correlation. Second, based on this framework, I show that empirical data obtained from a face discrimination task can be explained by linear spatiotemporal integration of the evidence conferred by individual facial features. Together, I propose that psychophysical reverse correlation, combined with quantitative behavioral modeling, can be leveraged to understand how sensory signals are converted into an action in perceptual decision-making.
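For readers unfamiliar with the method, the sketch below illustrates the core computation behind psychophysical reverse correlation as it is commonly applied to behavioral data: average the trial-by-trial stimulus fluctuations separately for each choice and take the difference, which yields the observer's temporal weighting profile (the "psychophysical kernel"). This is a minimal, generic sketch; the function name, array shapes, and the simulated early-weighting observer are illustrative assumptions, not code or parameters from the talk.

```python
import numpy as np

def psychophysical_kernel(stimulus_fluctuations, choices):
    """Choice-conditioned average of stimulus fluctuations.

    stimulus_fluctuations : array, shape (n_trials, n_frames)
        Frame-by-frame fluctuations of stimulus strength around its mean
        on each trial (e.g., noise added to a morph level in a face task).
    choices : array, shape (n_trials,)
        Binary choice on each trial (1 or 0).

    Returns the difference between the mean fluctuation preceding
    choice-1 trials and choice-0 trials: frames that pushed the observer
    toward choice 1 appear as positive deflections in the kernel.
    """
    s = np.asarray(stimulus_fluctuations, dtype=float)
    c = np.asarray(choices).astype(bool)
    return s[c].mean(axis=0) - s[~c].mean(axis=0)

# Example with simulated data (hypothetical observer who weights early
# frames more heavily): the recovered kernel shows a decaying profile.
rng = np.random.default_rng(0)
noise = rng.normal(size=(5000, 10))           # 5000 trials, 10 stimulus frames
true_weights = np.linspace(1.0, 0.2, 10)      # assumed early-weighting profile
choices = (noise @ true_weights + rng.normal(size=5000) > 0).astype(int)
kernel = psychophysical_kernel(noise, choices)
```

As the abstract notes, the shape of such a kernel reflects both the sensory and decision stages, so interpreting it quantitatively requires an explicit model rather than reading the raw curve at face value.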