

Mobile cognition: imaging the human brain in the ‘real world’

M. Stangl, S. L. Maoz, et al.

Mobile neuroimaging enables recording human brain activity during freely moving, real-world behavior, promising unique insights into cognitive mechanisms and novel treatments for neurological and psychiatric disorders. The paper discusses the challenges of studying naturalistic behavior and strategies to overcome them, arguing that these methods could usher in a new era of cognitive neuroscience. Research conducted by Matthias Stangl, Sabrina L. Maoz, and Nanthia Suthana.

Introduction
One of the main goals of cognitive neuroscience is to understand how the brain supports natural human behavior and cognition and, ultimately, to detect and treat malfunctions in the underlying neural systems. Research over several decades has provided invaluable insight into the neural mechanisms that support human behavior and cognition, using carefully designed experimental tasks completed while brain activity is measured with a variety of neuroimaging techniques. Real-world studies without brain recordings have investigated naturalistic behaviors using wearable sensors, smartphone apps, location tracking, ecological assessments, and audiovisual capture, but the inability to record high-quality brain signals simultaneously has limited inference. Traditional neuroimaging requires immobility and tightly controlled laboratory tasks that may not reflect dynamic real-world scenarios. Animal studies show that spontaneous natural behavior and stimulus naturalism strongly impact neural activity, raising questions about the generalizability of laboratory-derived neural mechanisms to real-world human cognition. Large knowledge gaps remain for motor functions, movement-related disorders, spatial navigation and memory in the real world, spontaneous emotional affect and expression, and social interaction. To address these questions, neuroimaging during natural movement and behavior is needed, requiring the development of mobile recording methods and adaptations of experimental design and analysis to handle real-world complexity. Recent technological advances enable mobile human neuroimaging, opening new areas of cognitive neuroscience. This Perspective highlights these technologies, novel findings, and potential insights, discusses limitations and challenges, and argues that overcoming them through research–development synergy will transform human cognitive neuroscience.
Literature Review
Traditional neuroimaging methods (fMRI, fNIRS, scalp EEG, MEG, iEEG) have elucidated human cognition but are largely restricted to immobile participants and superficial recordings, limiting access to subcortical regions (basal ganglia, medial temporal lobe). Some studies have inferred movement correlates via motor imagery or observation, aiding brain–machine interfaces (BMIs), but these findings require direct comparisons with physical movement for validation. Virtual reality (VR) has been used to simulate mobility while participants remain stationary, ranging from view-based 2D displays to immersive headsets with movement (e.g., omnidirectional treadmills), yet participants remain motionless during neuroimaging sessions. It is unclear how VR findings translate to real-life experiences that include idiothetic self-motion cues (proprioceptive, vestibular, motor), rich environmental features, and dynamic demands. Research in non-human animals demonstrates strong effects of natural behavior on neural activity, motivating mobile recordings in humans. Recent advances include wearable scalp EEG with motion-artifact reduction and source localization to deep structures; mobile fNIRS with improved motion correction; and mobile MEG using optically pumped magnetometers (OPM-MEG) in shielded rooms. Invasive closed-loop deep brain stimulation (DBS) devices provide chronic, motion-artifact-free iEEG from deep structures (hippocampus, entorhinal cortex, amygdala, nucleus accumbens) during everyday activities, with stimulation capabilities enabling causal tests. Hyperscanning extends social neuroscience beyond stationary dual-video paradigms to face-to-face multi-person recordings, revealing how real-world interactions modulate interbrain synchrony.
Methodology
This Perspective synthesizes methodological advances enabling mobile human neuroimaging rather than presenting a single empirical protocol. Key approaches include:

1) Mobile scalp EEG: miniaturized wearable systems and advanced artifact correction (e.g., ICA, higher-order statistics) for recordings during movement; high-density source localization enabling analysis of deeper regions (e.g., thalamus, retrosplenial cortex).

2) Mobile fNIRS: wearable multi-channel systems with motion correction to study cortical hemodynamics during real-world tasks, including face-to-face interaction and gait.

3) Mobile MEG (OPM-MEG): optically pumped magnetometers in wearable helmets permitting movement within magnetically shielded environments to capture cortical and subcortical activity.

4) Invasive mobile iEEG via closed-loop DBS and responsive neurostimulation (RNS): chronically implanted electrodes stream deep brain activity during everyday life, with programmable stimulation for causal perturbations and longitudinal recordings (months to years). Platforms such as Mo-DBRS enable synchronized recording and stimulation in freely moving humans.

5) Hyperscanning: simultaneous mobile EEG (and future iEEG/EEG combinations) across interacting individuals to analyze interbrain synchrony in naturalistic social contexts.

6) Multimodal behavioral and physiological data: integration of inertial measurement units (IMUs), optical motion tracking, mobile eye-tracking, audio/video, heart rate, respiration, skin conductance, and smartphone-based symptom ratings and environmental measures. Synchronization via time-stamped markers, network time protocols, and platform-specific solutions (e.g., Mo-DBRS, Open Mind Consortium).

7) Analytical frameworks: mixed-effects, multimodal, and multivariate models; computational ethology with machine learning and computer vision to quantify behavior and environment; longitudinal analyses relating brain dynamics to symptom fluctuations and daily activities.

Collectively, these methodologies increase ecological validity and enable the study of cognition during natural movement and in complex environments.
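To make the artifact-correction step in item 1) concrete, the sketch below shows how a mobile scalp EEG recording made during walking might be cleaned with ICA using the open-source MNE-Python toolkit. This is a minimal sketch, not the authors' pipeline: the file name, component count, and excluded-component indices are illustrative assumptions.

```python
# Minimal sketch: ICA-based cleanup of a mobile EEG recording (MNE-Python).
# File name, component count, and excluded indices are placeholders.
import mne
from mne.preprocessing import ICA

# Load a (hypothetical) mobile EEG recording acquired during walking
raw = mne.io.read_raw_fif("mobile_walk_raw.fif", preload=True)

# Band-pass filter; ICA decompositions are more stable on 1 Hz high-passed data
raw_filt = raw.copy().filter(l_freq=1.0, h_freq=40.0)

# Decompose the signal into independent components
ica = ICA(n_components=20, random_state=97, max_iter="auto")
ica.fit(raw_filt)

# Inspect component topographies and mark putative motion/muscle artifacts
# for rejection (the indices below are placeholders, not a recipe)
ica.plot_components()
ica.exclude = [0, 3]

# Reconstruct the cleaned recording without the excluded components
raw_clean = raw.copy()
ica.apply(raw_clean)
```

In practice, component selection for movement and muscle artifacts is typically guided by topography, spectra, and correlation with motion sensors rather than fixed indices.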
Key Findings
The article aggregates recent discoveries enabled by mobile neuroimaging across domains.

Spatial navigation and memory:
• Mobile iEEG in humans revealed higher-frequency theta (~6–8 Hz) in the medial temporal lobe (MTL) during physical navigation, reconciling prior VR studies in stationary participants that reported lower-frequency theta (~1–4 Hz).
• Human movement-related theta occurs in bouts, is more prevalent during fast than slow walking or stationary periods, and is modulated by proximity to environmental boundaries (walls); a minimal analysis sketch follows this list.
• MTL theta power also reflects others' locations, indicating shared mechanisms for encoding self and other position; boundary-related increases are present only when location is behaviorally relevant.
• Mobile scalp EEG/fNIRS studies associate cortical low-frequency oscillations with the speed and direction of movement, suggest multisensory integration (visual, kinesthetic, vestibular, proprioceptive), and identify frontal-midline low-frequency bursts during navigational decision-making.
• Source-localized EEG implicates deep regions (retrosplenial complex) and shows that tactile interaction with the environment (e.g., touching walls) modulates spatial learning.
• Real-world mobile EEG shows neural signatures of item–context binding and that environmental features (indoor/outdoor settings, landmarks) strongly affect memory formation.

Social cognition and interaction:
• Mobile EEG hyperscanning demonstrates that interbrain oscillatory synchrony in real-world settings predicts class engagement and social dynamics in classrooms (longitudinal recordings across a semester).
• In museum dyads, interbrain synchrony is modulated by empathy, closeness, engagement, joint action, and eye contact, highlighting the importance of face-to-face interaction.

Movement and motor-related functions:
• Mobile EEG and iEEG identify gait-cycle-related modulations in motor/somatosensory regions; upper beta (~13–30 Hz) and high-gamma oscillations encode movement onset, termination, muscle synergies, and freezing of gait (Parkinson disease).
• Intentional gait modification elicits phase-specific decreases across multiple oscillations, informing BMI design for rehabilitation.
• Obstacle avoidance: motor-plan updating occurs at obstacle appearance; beta oscillations mark traversal.
• Cognitive–motor interference: classical cognitive responses are altered during walking versus stationary performance in dual-task paradigms; age-specific differences in sensorimotor rhythms suggest resource competition in older adults.
• Mobile approaches extend to expressive movement (dance), revealing neural computations underlying complex coordination, memory, and timing.

Emotional affect and expression:
• Mobile iEEG enables recording from affective circuits (nucleus accumbens, amygdala, prefrontal cortex) during significant real-world events to study natural fluctuations in affect and fear-memory formation.
• Environmental context and activity (urban vs. green spaces; active exploration vs. passive viewing) modulate brain activity and emotions in mobile EEG, informing stress-recovery mechanisms.
• Emotion recognition using non-invasive mobile neuroimaging supports real-time classification for human–computer interaction, e-learning, entertainment, and medical monitoring.

Clinical mobile iEEG findings:
• Parkinson disease: long-term at-home iEEG (≈2,600 hours) shows that frequency-specific oscillatory coherence between the subthalamic nucleus and motor cortex differentiates dyskinetic on/off states.
• OCD: >1,000 hours of at-home iEEG synchronized with heart rate and symptom ratings reveal a negative correlation between ventral capsule/ventral striatum (VC/VS) delta (0–4 Hz) power and symptom intensity; ventral striatum stimulation increases positive affect.
• MDD: state-specific neurophysiological activity responds to VC/VS stimulation in a dose-dependent manner; closed-loop DBS yields clinical improvement.

These results demonstrate the feasibility and scientific value of long-term mobile brain recordings with minimal intrusion, enabling discovery of biomarkers and personalized closed-loop therapies.
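As a concrete illustration of the navigation findings above (theta power being more prevalent during fast than slow walking), the sketch below band-passes a medial temporal channel in the ~6–8 Hz range and compares theta power between fast- and slow-walking samples. This is a minimal sketch under stated assumptions: the data arrays, sampling rate, and median split on speed are simulated placeholders, not the published analysis.

```python
# Minimal sketch: theta power during fast vs. slow walking.
# Signal and speed traces are simulated stand-ins for real recordings.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250                                        # sampling rate (Hz), assumed
rng = np.random.default_rng(0)
ieeg = rng.standard_normal(fs * 600)            # stand-in MTL channel (10 min)
speed = np.abs(rng.standard_normal(fs * 600))   # stand-in walking speed (m/s)

# Band-pass the signal in the ~6-8 Hz movement-related theta band
b, a = butter(4, [6 / (fs / 2), 8 / (fs / 2)], btype="band")
theta = filtfilt(b, a, ieeg)

# Instantaneous theta power from the analytic (Hilbert) signal
power = np.abs(hilbert(theta)) ** 2

# Median-split the speed trace into fast vs. slow samples and compare power
fast = speed > np.median(speed)
print("theta power, fast walking:", power[fast].mean())
print("theta power, slow walking:", power[~fast].mean())
```

Real analyses additionally segment theta into discrete bouts and account for stationary periods and boundary proximity; this sketch only shows the basic band-power comparison.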
Discussion
Mobile neuroimaging directly addresses the core challenge of studying human cognition and behavior in ecologically valid contexts by enabling brain recordings during natural movement, social interaction, and emotional experiences. Findings across navigation, social interaction, motor control, and affect show that neural dynamics differ meaningfully from those observed under stationary laboratory conditions, validating the premise that naturalistic behaviors and multisensory inputs modulate brain activity. Mobile iEEG and non-invasive modalities provide complementary windows into deep and superficial structures, allowing investigation of the distributed networks underpinning cognition. Longitudinal mobile recordings capture spontaneous symptom fluctuations and daily-life behaviors, providing clinically actionable biomarkers and facilitating personalized closed-loop interventions for disorders such as Parkinson disease, OCD, and MDD. The integration of wearable behavioral sensors and advanced analytics (mixed-effects models, computational ethology) enables rigorous modeling of real-world complexity to disentangle signal from confounds. Together, these advances support a bidirectional research paradigm in which laboratory models and real-world findings iteratively refine each other, advancing mechanistic understanding and translational applications.
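The mixed-effects modeling mentioned above can be sketched as a linear mixed model that relates a neural feature (here, theta power) to a behavioral covariate (walking speed) with per-participant random intercepts, using statsmodels. This is a hedged illustration only: the data frame, effect sizes, and model structure are simulated assumptions, and real analyses would use recorded features and richer random-effects structures.

```python
# Minimal sketch: linear mixed-effects model with per-subject random intercepts.
# All data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_epochs = 10, 200
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_epochs),
    "speed": rng.uniform(0.2, 1.6, n_subj * n_epochs),   # walking speed (m/s)
})

# Simulated theta power with a subject-specific offset and a speed effect
subj_offset = rng.normal(0, 0.5, n_subj)[df["subject"]]
df["theta_power"] = 1.0 + 0.8 * df["speed"] + subj_offset \
    + rng.normal(0, 0.3, len(df))

# Fixed effect of speed, random intercept per subject
model = smf.mixedlm("theta_power ~ speed", df, groups=df["subject"])
result = model.fit()
print(result.summary())
```

The random intercept absorbs stable between-participant differences (for example, electrode placement or baseline power) so that the fixed effect of speed is estimated from within-participant variation.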
Conclusion
Mobile neuroimaging methods, coupled with wearable technologies and advanced computational analyses, are transforming cognitive neuroscience by enabling investigations of human cognition during natural, freely moving, and socially embedded real-world experiences. This approach tests the generalizability of laboratory-derived models and, critically, generates new insights and theories that reflect ecological complexity. Future development of mobile technologies, including devices capable of recording single-neuron or neurochemical signals during everyday life, will accelerate progress. A synergistic cycle of theory, laboratory experimentation, and real-world study is essential to achieve an ecologically valid understanding of cognition and to inform the development of effective treatments for neurological and psychiatric disorders that translate to real-world settings.
Limitations
Real-world mobile neuroimaging introduces challenges for experimental control due to numerous uncontrolled variables (luminance, sounds, odors, movements, social factors). Studies must innovatively balance control and naturalism and incorporate variables previously treated as confounds into design and analysis. Multimodal data acquisition and precise synchronization across sensors (motion tracking, eye-tracking, physiological, audio/video) are required, along with advanced analytics (mixed-effects models, multivariate/multimodal approaches, computational ethology) to quantify behavior and environment. Mobile iEEG has selective electrode coverage driven by clinical indications and limited channels per participant, necessitating larger cohorts and pooling of channels across participants; recordings currently derive from patient populations (e.g., epilepsy, movement disorders), raising generalizability concerns. Mitigation strategies include algorithmic detection and exclusion of abnormal activity (e.g., epileptiform discharges), recruiting participants with low event rates and favorable treatment response, logging medications, and comparative studies with healthy participants using non-invasive modalities. Non-invasive mobile methods offer broader coverage but are most sensitive to superficial structures; combining iEEG with EEG/OPM-MEG/fNIRS can mitigate coverage limitations. Magnetic shielding remains necessary for OPM-MEG, constraining usable recording environments. Overall, methodological rigor in design, synchronization, analytics, and patient selection is crucial to ensure interpretability and generalizability.
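One of the mitigation strategies listed above is algorithmic flagging and exclusion of abnormal activity such as epileptiform discharges. The toy heuristic below marks one-second windows whose line length is a statistical outlier; it only illustrates the idea, since published studies rely on validated detectors and expert review, and the threshold and window length here are assumptions.

```python
# Minimal sketch: flag epochs with putative abnormal (spiky) activity using a
# simple line-length outlier rule. Illustrative only, not a validated detector.
import numpy as np

def line_length(x):
    """Sum of absolute sample-to-sample differences (a spiky-ness proxy)."""
    return np.sum(np.abs(np.diff(x)))

def flag_abnormal_epochs(signal, fs, win_s=1.0, z_thresh=5.0):
    """Return a boolean mask over windows whose line length is an outlier."""
    win = int(win_s * fs)
    n_win = len(signal) // win
    ll = np.array([line_length(signal[i * win:(i + 1) * win])
                   for i in range(n_win)])
    z = (ll - ll.mean()) / ll.std()
    return z > z_thresh

# Example: keep only one-second epochs that are not flagged as abnormal
fs = 250
sig = np.random.default_rng(2).standard_normal(fs * 120)  # simulated 2-min trace
mask = flag_abnormal_epochs(sig, fs)
clean_epochs = [sig[i * fs:(i + 1) * fs] for i, bad in enumerate(mask) if not bad]
print(f"kept {len(clean_epochs)} of {len(mask)} one-second epochs")
```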