DOI: 10.1101/480178 · Nov 29, 2018 · Paper

Neural Correlates of Optimal Multisensory Decision Making

bioRxiv: the Preprint Server for Biology
Han Hou, Yong Gu

Abstract

Perceptual decisions are often based on multiple sensory inputs whose reliabilities rapidly vary over time, yet little is known about how our brain integrates these inputs to optimize behavior. Here we show that multisensory evidence with time-varying reliability can be accumulated near-optimally, in a Bayesian sense, by simply taking time-invariant linear combinations of neural activity across time and modalities, as long as the neural code for the sensory inputs is close to an invariant linear probabilistic population code (ilPPC). Recordings in the lateral intraparietal area (LIP) while macaques optimally performed a vestibular-visual multisensory decision-making task revealed that LIP population activity reflects an integration process consistent with the ilPPC theory. Moreover, LIP accumulates momentary evidence proportional to vestibular acceleration and visual velocity, which are encoded in sensory areas with a close approximation to ilPPCs. Together, these results provide a remarkably simple and biologically plausible solution to optimal multisensory decision making.
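
As a rough sketch of the ilPPC idea summarized above (the notation below is ours, not the paper's, and assumes the standard exponential-family form of a linear probabilistic population code): suppose the momentary population activity $\mathbf{r}_m(t)$ in modality $m$ encodes the stimulus variable $s$ through a likelihood of the form

$$
p\big(\mathbf{r}_m(t)\mid s\big) \;\propto\; \phi_m\!\big(\mathbf{r}_m(t)\big)\,
\exp\!\big(\mathbf{h}_m(s)^{\top}\mathbf{r}_m(t)\big),
$$

where the kernels $\mathbf{h}_m(s)$ do not depend on stimulus reliability (the "invariant linear" property). Then, with activity treated as conditionally independent across time and modalities, the log posterior is, up to terms independent of $s$,

$$
\log p\big(s \mid \{\mathbf{r}_m(t)\}\big) \;=\;
\sum_{m}\sum_{t}\mathbf{h}_m(s)^{\top}\mathbf{r}_m(t) \;+\; \mathrm{const},
$$

so a fixed, time-invariant linear read-out that simply sums (suitably weighted) activity across time and modalities carries all the evidence needed for Bayes-optimal integration, even when reliability fluctuates from moment to moment. This is the sense in which accumulating "time-invariant linear combinations of neural activity" can be near-optimal.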

