
Psychology
Brain-wide dynamics linking sensation to action during decision-making
A. Khilkevich, M. Lohse, et al.
Discover how brain areas work together during perceptual decision-making. This research by Andrei Khilkevich and colleagues reveals how sensory evidence and motor planning are integrated across many brain regions, linking the accumulation of evidence to the preparation of upcoming actions.
Introduction
Perceptual decisions require the brain to learn associations between sensory evidence and appropriate actions, which involves filtering and integrating relevant inputs to generate timely responses. Understanding how the brain transforms sensory input into action is central to explaining how external events come to guide beneficial behavior. The prevailing view is that the brain gradually accumulates sensory input into an integrated neural representation that determines the upcoming choice. Evidence integration has been observed in several areas, particularly frontal-premotor and posterior parietal cortices and downstream targets such as the striatum, yet recent studies indicate much broader encoding of sensory inputs, choices, and actions throughout the brain. This raises questions about where and how these transformations occur, and whether they are shaped by inherent regional properties or by learning.

While task-relevant representations are thus distributed across the brain, the mechanisms orchestrating the transformations between sensory input, evidence integration, motor planning, and execution remain unclear. This study addresses that gap by examining brain-wide neural activity in mice learning to report changes in ambiguous visual input. It also investigates how the distinct dynamics of evidence integration, movement preparation, and execution are orchestrated across brain regions and across dimensions of neural activity, asking in particular to what extent these transformations are segregated or parallelized. To separate sensory and decision-related processes from action signals, the researchers use a visual change detection task adapted for rodents, which enables dense electrode recordings for comprehensive brain-wide measurements.
Literature Review
Existing literature highlights the importance of learned associations between sensory evidence and actions in perceptual decision-making. Studies have identified evidence integration in areas such as the frontal-premotor and posterior parietal cortices and their downstream targets, including the striatum. However, recent research reveals broader encoding of sensory inputs, choices, and actions across the brain, leaving open questions about the precise location and mechanisms of these transformations and the relative influence of learning versus inherent regional properties. While evidence integration has been linked to preparatory activity in specific brain regions, its broader impact on movement-related dynamics, and the regions involved, are not well understood at a brain-wide scale. The current study addresses these gaps using a refined task and methodology that distinguish sensory evidence from motor signals.
Methodology
To investigate brain-wide transformations of sensory input into choice and action, food-restricted, head-fixed mice were trained on a visual change detection task. Mice observed a drifting grating stimulus with noisy fluctuations in its speed (temporal frequency, TF) and reported sustained increases in speed by licking a reward spout. The task was designed to separate ongoing observation of visual evidence from movement-related activity by limiting reward accessibility and aborting trials with premature movements. Dense silicon electrode recordings (Neuropixels probes) captured activity from 15,406 units across 51 brain regions (cortex, basal ganglia, hippocampus, thalamus, midbrain, cerebellum, and hindbrain) in 15 mice across 114 sessions. High-speed videography recorded face and pupil movements and running-wheel activity. Single-cell Poisson generalized linear models (GLMs) of trial-to-trial neural activity identified neurons encoding visual evidence (stimulus TF), lick preparation, and lick execution. To analyze the propagation of sensory evidence, neural responses to fast TF pulses were quantified in terms of peak time and duration. Parallel sensory integration in premotor areas was further investigated by analyzing responses to sequences of fast TF pulses, calculating response facilitation, and determining the correlation between facilitation and response duration. Behavioral analysis, including psychophysical kernel analysis and modeling, was used to differentiate between integration and outlier-detection strategies. The researchers also compared neural activity in trained and untrained mice to determine the role of learning in the encoding of sensory evidence. Finally, to understand how integrated evidence is transformed into action preparation, the alignment of population vectors for fast-TF-pulse responses and for preparatory activity before lick onset was calculated, and orthogonal subspaces of population activity for movement preparation and execution were identified and analyzed.
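The single-cell GLM step can be illustrated with a short, self-contained sketch. The code below fits a Poisson encoding model for one simulated neuron, regressing binned spike counts on lagged copies of the stimulus TF trace plus lick-preparation and lick-execution indicators; the bin size, lag range, regularisation, library choice (Python with scikit-learn), and all variable names are illustrative assumptions rather than the authors' exact design.

```python
# Minimal sketch of a single-neuron Poisson encoding model in the spirit of the
# GLM analysis described above. Toy data; regressor choices are assumptions.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)

# Toy data: one neuron, spike counts in 50 ms bins.
n_bins = 5000
stim_tf = rng.normal(1.0, 0.3, n_bins)                 # noisy stimulus temporal frequency (TF)
lick_prep = (rng.random(n_bins) < 0.05).astype(float)  # hypothetical pre-lick indicator
lick_exec = (rng.random(n_bins) < 0.05).astype(float)  # hypothetical lick-execution indicator

def lagged(x, lags):
    """Lagged copies of a signal so the model can capture delayed responses
    (np.roll wraps at the edges, which is fine for a toy example)."""
    return np.column_stack([np.roll(x, k) for k in lags])

X = np.column_stack([lagged(stim_tf, range(6)),        # ~0-250 ms of TF history
                     lick_prep, lick_exec])

# Simulate spike counts from a known log-linear rate so the fit is verifiable.
true_w = rng.normal(0, 0.2, X.shape[1])
rate = np.exp(X @ true_w - 1.0)
y = rng.poisson(rate)

# Poisson GLM (log link); alpha adds mild L2 regularisation.
glm = PoissonRegressor(alpha=1e-3, max_iter=1000).fit(X, y)
print("fitted TF kernel:      ", np.round(glm.coef_[:6], 3))
print("lick prep/exec weights:", np.round(glm.coef_[6:], 3))
```

In an analysis of real recordings, neurons whose TF regressors meaningfully improve the model fit would be candidates for "visual evidence" coding, while the lick regressors absorb movement-related activity.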
Key Findings
The study revealed surprisingly widespread sensory evidence representations, with a sparse subpopulation of neurons tracking subtle visual input fluctuations in nearly all brain areas except those controlling orofacial movements. Sensory responses evolved from brief activations in early visual areas to sustained representations in downstream regions, indicating parallel evidence accumulation. Behavioral analysis confirmed that mice used temporal integration of stimulus information (approximately 0.25 s), aligning with the observed longer timescales of neural responses outside the visual system. Regions integrating evidence exhibited response facilitation to sequential fast TF pulses, correlating with response duration. The encoding of sensory evidence outside the visual system was largely learning-dependent, as untrained mice did not show similar responses in non-visual areas. Importantly, the timescales of integration were not determined by intrinsic regional dynamics, suggesting a role for task experience. The study further showed that evidence integration and movement preparation were encoded in the same population activity subspace, orthogonal to movement-related dynamics. TF-responsive neurons disproportionately contributed to preparatory activity in this subspace, and this activity collapsed at movement onset, resetting the integration process. This orthogonalization allowed concurrent processing of evidence accumulation and movement planning and execution.
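As a concrete illustration of the psychophysical-kernel idea behind that behavioral estimate, the sketch below simulates an observer that licks whenever the recent stimulus TF, averaged over a short causal window, crosses a threshold, and then recovers that integration window by averaging the stimulus preceding each lick. The data, window length, and threshold are hypothetical and only illustrate the logic, not the paper's exact behavioral analysis.

```python
# Lick-triggered psychophysical kernel on synthetic data: average the noisy TF
# trace preceding each lick to estimate the stimulus history driving decisions.
import numpy as np

rng = np.random.default_rng(2)
dt = 0.05                                    # 50 ms stimulus frames (assumed)
n_frames = 200_000
tf = rng.normal(1.0, 0.3, n_frames)          # noisy TF trace around a baseline of 1

# Simulated observer: lick when TF averaged over the last ~0.25 s (5 frames)
# exceeds a threshold, i.e. a simple causal boxcar integrator.
window = 5
smoothed = np.convolve(tf, np.ones(window) / window, mode="full")[:n_frames]
lick_frames = np.where(smoothed > 1.35)[0]
# Keep licks separated by at least 1 s so the analysis windows do not overlap.
lick_frames = lick_frames[np.insert(np.diff(lick_frames) > 20, 0, True)]

# Psychophysical kernel: mean TF deviation in the 1 s before each lick.
pre = 20
kernel = np.mean([tf[t - pre:t] - 1.0 for t in lick_frames if t >= pre], axis=0)
times = -np.arange(pre, 0, -1) * dt
for t, k in zip(times, kernel):
    print(f"{t:+.2f} s before lick: mean TF deviation {k:+.3f}")
```

The recovered kernel stays near zero far from the lick and rises only in the last few frames, mirroring the short behavioral integration window reported in the study.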
Discussion
The findings demonstrate that evidence integration is a widespread, learning-dependent phenomenon implemented by a sparse neural population across many premotor areas. The integration timescales were independent of intrinsic regional dynamics, highlighting the role of task experience in shaping these computations. The concurrent encoding of evidence integration and movement preparation in an orthogonal subspace, distinct from movement execution dynamics, provides a mechanism for parallel processing and efficient action selection. The observation that activity in this subspace collapsed at movement onset suggests a reset mechanism allowing for continuous integration of new sensory information. This work unifies concepts from decision-making and motor control, providing a comprehensive framework for how sensory evidence guides actions through global neural mechanisms. The study challenges the notion of sequential processing across specialized brain areas, suggesting instead a parallel, distributed computation across multiple regions.
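The orthogonal-subspace account lends itself to a simple quantitative illustration. The sketch below shows one generic way to compare population activity patterns: an alignment index measuring how much preparatory-epoch variance falls within an execution-epoch subspace (near zero when the subspaces are orthogonal), and the cosine alignment between a TF-pulse response vector and a preparatory population vector. The synthetic data, the SVD-based subspaces, and the specific metrics are assumptions for illustration and may differ from the authors' analysis.

```python
# Sketch: quantify alignment between population-activity subspaces and vectors.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_time = 200, 100

# Toy trial-averaged population activity (neurons x time) for two epochs.
prep_act = rng.normal(size=(n_neurons, n_time))   # pre-movement (preparatory) epoch
exec_act = rng.normal(size=(n_neurons, n_time))   # movement-execution epoch

def top_pcs(act, k=5):
    """Top-k principal directions (neurons x k) of mean-centred activity."""
    centred = act - act.mean(axis=1, keepdims=True)
    u, _, _ = np.linalg.svd(centred, full_matrices=False)
    return u[:, :k]

exec_pcs = top_pcs(exec_act)

# Alignment index: fraction of preparatory variance captured when preparatory
# activity is projected into the execution subspace (1 = shared, ~0 = orthogonal).
prep_c = prep_act - prep_act.mean(axis=1, keepdims=True)
alignment_index = np.sum((exec_pcs.T @ prep_c) ** 2) / np.sum(prep_c ** 2)
print(f"prep-vs-exec alignment index: {alignment_index:.3f}")

# Cosine alignment between two population vectors, e.g. the mean response to a
# fast TF pulse and the mean preparatory activity before lick onset.
tf_pulse_vec = rng.normal(size=n_neurons)          # placeholder TF-pulse response vector
prep_vec = prep_act.mean(axis=1)                   # placeholder preparatory vector
cos = tf_pulse_vec @ prep_vec / (np.linalg.norm(tf_pulse_vec) * np.linalg.norm(prep_vec))
print(f"TF-pulse vs preparatory cosine alignment: {cos:.3f}")
```

Under the paper's account, TF-pulse responses and preparatory activity would show high alignment with each other but a low alignment index with the execution subspace, which then dominates once the lick begins.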
Conclusion
This study provides a comprehensive brain-wide perspective on the neural mechanisms underlying perceptual decision-making. The findings highlight the widespread and parallel nature of evidence integration, its dependence on learning, and its tight coupling with movement preparation within distinct orthogonal subspaces of neural activity. Future research could explore the generalizability of these principles to tasks with multiple sensorimotor contingencies and investigate the specific mechanisms generating the action initiation signal.
Limitations
The study focused on a specific visual change detection task in mice, limiting the generalizability of the findings to other tasks and species. The GLM analysis rests on assumptions about neural responses (for example, Poisson spiking statistics and a particular form of dependence on the predictors) that could affect the interpretation of the results. The analysis also focused primarily on early lick responses, potentially overlooking other aspects of the decision-making process.