Neural Representations of Sensory Uncertainty and Confidence Are Associated with Perceptual Curiosity

Psychology

M. Cohanpour, M. Aly, et al.

Humans are driven to reduce uncertainty — this fMRI study finds that low subjective confidence and a novel “OTC Certainty” measure in occipitotemporal cortex predict spikes in perceptual curiosity, while vmPFC and ACC track confidence and vmPFC mediates the OTC–curiosity link. Research conducted by Michael Cohanpour, Mariam Aly, and Jacqueline Gottlieb.

Introduction
The study investigates how neural representations of uncertainty give rise to feelings of curiosity. Prior behavioral work shows an inverse relationship between confidence and curiosity, but the neural link has been unclear, especially for complex, naturalistic stimuli where uncertainty about semantic content is difficult to quantify. The authors hypothesize that higher-level visual cortex (occipitotemporal cortex, OTC) provides multivariate representations of sensory certainty/uncertainty about categories, which are read out by frontal regions into a univariate confidence signal that in turn relates to curiosity. They test whether multivariate certainty in OTC predicts trial-by-trial confidence and curiosity for ambiguous images and whether frontal regions (vmPFC, ACC) mediate this link.
Literature Review
Curiosity has been studied using trivia paradigms, showing that curiosity engages motivation and reward circuits and enhances learning via hippocampal–dopaminergic interactions. Confidence and curiosity ratings are inversely related in such tasks. Sensory uncertainty is represented in early visual cortex (V1) as population codes, whereas subjective confidence is encoded in frontal regions, notably vmPFC and ACC, often as univariate BOLD signals. A readout hypothesis posits frontal areas transform multivariate sensory representations into lower-dimensional confidence signals. Prior work decoded uncertainty from V1 using assumed tuning (e.g., orientation), but it is unknown how uncertainty is represented for complex categories in higher-level visual cortex (OTC) and how this relates to curiosity.
Methodology
Participants: 32 right-handed adults (17 female), ages 18–35 (mean 27.2, SD 4.5), education mean 16.1 years (SD 2.8), normal or corrected-to-normal vision, compensated $40. The protocol was IRB-approved and informed consent was obtained.

Stimuli: 42 animal and 42 man-made object images (Konkle Lab database), normalized for luminance and contrast (SHINE Toolbox). "Texforms" were generated with a texture synthesis algorithm (Deza et al., 2019): thousands of first- and second-order statistics were computed across pooling regions, and texforms were synthesized from noise via stochastic gradient descent (100 iterations). The spatial pooling factor was fixed at 0.28 across images. Low-level properties were quantified as luminance (mean intensity), RMS contrast (SD of luminance), and spatial frequency (slope of the 2D FFT-based power spectrum).

Task design: Perceptual curiosity task with 84 trials across 4 runs. On each trial, a texform was displayed for 4 s (participants formed their best guess), followed by a confidence rating (0–100) and a curiosity rating (0–100) via MR trackball (up to 5 s each; initial slider position randomized), then the clear (original) image for 2 s. Participants were informed that images were equally likely to be animals or man-made objects. A fixed payment ensured ratings were independent of incentives. After the main task, an unannounced localizer run presented 24 miniblocks (12 animal, 12 man-made), each with 20 undistorted images (333 ms per image; 333 ms ISI), separated by 13 s fixation periods, with a one-back cover task.

MRI acquisition: 3T Siemens Prisma, 64-channel head coil. Functional multiband EPI: TR 2 s, TE 30 ms, flip angle 80°, acceleration factor 3, 2 mm isotropic voxels, 69 axial slices (tilted 14° from transverse toward coronal), posterior–anterior phase encoding. Five runs (4 task, 1 localizer). Structural: T1-weighted MPRAGE, 1.0 mm isotropic.

Preprocessing/analysis: FSL (FEAT, FNIRT, fslmaths); BET brain extraction; MCFLIRT motion correction; high-pass temporal filtering (100 s cutoff); 3 mm FWHM smoothing; nonlinear registration to 1 mm MNI152 space (12 DOF); FILM prewhitening. Subsequent analyses were performed in MATLAB.
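The first two low-level image properties are plain summary statistics over pixel intensities; a minimal sketch of both (the spatial-frequency slope additionally requires a 2D FFT and is omitted here). The variable `img` is a hypothetical flattened grayscale image with intensities in [0, 1]:

```python
from statistics import mean, pstdev

def luminance(pixels):
    """Luminance as defined in the study: mean pixel intensity."""
    return mean(pixels)

def rms_contrast(pixels):
    """RMS contrast: standard deviation of pixel intensities."""
    return pstdev(pixels)

img = [0.0, 0.5, 1.0, 0.5]  # toy 2x2 grayscale image, flattened
print(luminance(img))     # 0.5
print(rms_contrast(img))  # ~0.354
```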
ROIs: vmPFC ROI from Mackey & Petrides (2014), excluding overlap with the corpus callosum. ACC and OTC ROIs from the Harvard–Oxford atlas (50% threshold). All ROIs were defined in 1.0 mm MNI space.

GLMs: Localizer GLM (#1): one boxcar regressor per miniblock (16.66 s), plus motion regressors and their derivatives; double-gamma HRF. Beta maps were generated per miniblock; within OTC, animal miniblock betas were averaged to create an animal template, and man-made miniblock betas to create a man-made template. Category reliability was verified via higher within-category than between-category pattern correlations. Single-trial GLM (#2): each texform presentation (4 s) was modeled as a separate regressor per trial; nuisance regressors modeled rating periods (duration = RT) and clear-image periods (2 s), plus motion and derivatives; double-gamma HRF; FILM prewhitening. Univariate betas (mean within ROI) were extracted for vmPFC, ACC, and OTC, along with trial-wise multivariate voxel patterns for OTC.

OTC Certainty metric: For each trial, Pearson correlations were computed between the texform-evoked OTC pattern and the animal template (r_a) and the man-made template (r_mm). OTC Certainty = mean(r_a, r_mm) × |r_a − r_mm|, reflecting model certainty (the mean) and approximation certainty (the absolute difference), following machine-learning uncertainty frameworks.

Statistics: Linear and quadratic mixed-effects models (MATLAB fitlme) with participant-specific random slopes and intercepts. Models tested: curiosity ~ confidence (+ confidence²); confidence/curiosity ~ OTC Certainty (± quadratic term); vmPFC/ACC activity ~ confidence or curiosity; vmPFC/ACC activity ~ OTC Certainty. Low-level image properties (luminance, contrast, spatial frequency) were controlled for where relevant. Independent variables were z-scored within participant. Model comparisons used BIC.
Mediation analysis (Baron & Kenny; MATLAB implementation with bootstrap, 1,000 iterations) tested whether vmPFC or ACC activity mediates OTC Certainty–curiosity link, evaluating change from c to c′ while including quadratic terms where pairwise relations were quadratic (mediation inference on linear terms).
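The mediation itself was run in MATLAB; as a rough illustration of the Baron & Kenny logic (total effect c from y ~ x, direct effect c′ from y ~ x + m, bootstrapped attenuation c′ − c), here is a pure-Python sketch on synthetic data. All variable names and noise levels are invented for the demo:

```python
import random
from statistics import mean

def slope_simple(x, y):
    """OLS slope of y ~ x (the total effect, path c)."""
    mx, my = mean(x), mean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def partial_slope(x, m, y):
    """OLS slope of x in y ~ x + m (the direct effect, path c'),
    from the centered normal equations via Cramer's rule."""
    mx, mm, my = mean(x), mean(m), mean(y)
    xc = [v - mx for v in x]
    mc = [v - mm for v in m]
    yc = [v - my for v in y]
    sxx = sum(v * v for v in xc)
    smm = sum(v * v for v in mc)
    sxm = sum(a * b for a, b in zip(xc, mc))
    sxy = sum(a * b for a, b in zip(xc, yc))
    smy = sum(a * b for a, b in zip(mc, yc))
    det = sxx * smm - sxm ** 2
    return (sxy * smm - sxm * smy) / det

def mediation_reduction(x, m, y, n_boot=1000, seed=0):
    """Bootstrap the mean attenuation c' - c; values above zero
    indicate the mediator m absorbs part of x's effect on y."""
    rng = random.Random(seed)
    n = len(x)
    diffs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        xb = [x[i] for i in idx]
        mb = [m[i] for i in idx]
        yb = [y[i] for i in idx]
        diffs.append(partial_slope(xb, mb, yb) - slope_simple(xb, yb))
    return mean(diffs)

# Hypothetical demo: m fully carries x's negative effect on y
rng = random.Random(1)
x = [rng.gauss(0, 1) for _ in range(60)]
m = [xi + rng.gauss(0, 0.3) for xi in x]
y = [-mi + rng.gauss(0, 0.3) for mi in m]
print(slope_simple(x, y))      # total effect c: clearly negative
print(partial_slope(x, m, y))  # direct effect c': near zero
print(mediation_reduction(x, m, y, n_boot=200, seed=2))  # c' - c: positive
```

This mirrors the paper's pattern qualitatively: including the mediator pushes the direct path toward zero, and the bootstrap distribution of c′ − c tests whether that attenuation is reliable.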
Key Findings
- Behavior: Curiosity and confidence showed a negative quadratic relationship; curiosity peaked at relatively low confidence and declined at higher confidence. Mixed-effects model: β_linear = −13.46, p < 0.0001, 95% CI [−15.8, −11.0]; β_quadratic = −5.60, p < 0.0001, 95% CI [−7.21, −3.99]. The quadratic model outperformed the linear one (BIC_quadratic − BIC_linear = −180). Demographics and low-level image properties did not significantly predict curiosity or confidence.
- OTC category templates: In OTC, localizer miniblocks showed higher within-category than between-category pattern correlations (mean r: 0.80 vs 0.58; p = 0.008). Texform patterns correlated more strongly with the matching than the non-matching category template (r: 0.50 vs 0.43; p = 0.01).
- OTC Certainty relationships: OTC Certainty positively predicted confidence (β = 1.95; p = 0.0008; 95% CI [0.80, 3.09]) and negatively predicted curiosity (β = −1.21; p = 0.007; 95% CI [−2.08, −0.33]). Linear models were favored over quadratic (confidence BIC difference = 17; curiosity BIC difference = 23). Univariate OTC activity did not predict curiosity (β = −0.23; p = 0.63). The OTC Certainty–curiosity link remained significant when controlling for luminance, contrast, and spatial frequency, none of which were significant predictors.
- Control analyses: Within-participant z-scoring, alternative normalizations, removal of rare negative-correlation cases (<4% of data), and controls for motion, cursor starting position, and fixed distortion scaling did not alter the results.
- V1 analyses: V1 patterns did not differentiate categories (r: 0.57 vs 0.57; p = 0.90). Certainty derived from V1 predicted neither confidence (β = 0.12; p = 0.8) nor curiosity (β = 0.04; p = 0.9).
- Frontal ROIs: Univariate vmPFC and ACC activity scaled positively with confidence (vmPFC β = 3.32; p < 0.0001; 95% CI [1.80, 4.85]; ACC β = 2.34; p < 0.0001; 95% CI [0.88, 3.79]) and negatively with curiosity (vmPFC β_linear = −2.43; p < 0.0001; 95% CI [−3.58, −1.28]; β_quadratic = −0.66; p = 0.005; ACC β_linear = −1.14; p < 0.0001; 95% CI [−2.26, −0.02]; β_quadratic = −0.53; p = 0.005). Low-level image properties did not explain vmPFC/ACC activity; confidence and curiosity effects persisted with covariates.
- OTC Certainty–frontal activity: OTC Certainty correlated with vmPFC activity (β_linear = 5.55; p < 0.0001; BIC favored linear) and ACC activity (β_linear = 9.00; p < 0.0001).
- Mediation: OTC Certainty was negatively associated with curiosity (c = −0.046; p = 0.009). vmPFC mediated this link: c′ = −0.036 (p = 0.08), with a significant reduction (c′ − c = 0.0109; two-tailed p < 0.001). ACC did not mediate: c′ = −0.044 (p = 0.02); c′ − c = 0.002 (p = 0.30); the ACC c′ was significantly stronger than the vmPFC c′ (KS test, ks = 0.60; p < 0.001).
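The reported behavioral coefficients imply where curiosity peaks: for a downward-opening fit β₁·z + β₂·z², the vertex lies at z = −β₁ / (2β₂). A back-of-envelope check using the summary's betas (not a reanalysis of the paper's data):

```python
# Vertex of the fitted quadratic: curiosity ~ b1*z + b2*z^2,
# where z is within-participant z-scored confidence.
b1 = -13.46  # linear beta reported above
b2 = -5.60   # quadratic beta reported above

# Peak where the derivative b1 + 2*b2*z equals zero
z_peak = -b1 / (2 * b2)
print(round(z_peak, 2))  # -1.2 -> peak about 1.2 SDs below mean confidence
```

This is consistent with the stated finding that curiosity peaked at relatively low confidence and fell off as confidence rose.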
Discussion
Findings support a mechanistic link from multivariate sensory representations of category certainty in OTC to subjective confidence signals in vmPFC, which in turn relate inversely to curiosity. The negative quadratic relation between confidence and curiosity mirrors epistemic curiosity findings with trivia, extending them to perceptual curiosity for complex visual stimuli. The newly defined OTC Certainty metric captures two complementary components of certainty (model and approximation certainty) without assuming specific tuning curves, enabling quantification in higher-order visual cortex. The vmPFC, but not ACC, statistically mediates the relationship between OTC Certainty and curiosity, suggesting vmPFC transforms multivariate sensory evidence into a univariate confidence signal that influences curiosity. ACC activity also relates to confidence and curiosity but appears to do so through mechanisms independent of OTC Certainty, potentially reflecting roles in cognitive control or information gathering rather than the generation of the subjective curiosity state. Control analyses indicate these relationships are not driven by low-level image features or early visual cortex representations.
Conclusion
The study introduces a generalizable metric of multivariate sensory certainty in higher-level visual cortex (OTC) and demonstrates that this certainty predicts confidence and curiosity. A vmPFC-mediated pathway links OTC Certainty to curiosity, consistent with a transformation from high-dimensional sensory codes to low-dimensional control-relevant signals (confidence) that modulate curiosity. These findings elucidate how uncertainty about an event produces curiosity about that event and suggest that similar transformations may occur across domains where information is represented probabilistically. Future research could extend certainty measures to exemplar-level representations, test additional neural pathways contributing to curiosity, examine tasks requiring effortful information seeking to clarify ACC’s role, and explore how different levels of the sensory hierarchy contribute distinct forms of uncertainty to confidence and curiosity.
Limitations
- The OTC Certainty metric was computed at the category level (animal vs man-made), while participants may have generated exemplar-level guesses; although the method assumes category-consistent activation, future work should examine exemplar-level certainty.
- V1 analyses did not reveal category-specific certainty; however, V1 may encode uncertainty about local features not captured by the present approach, which could contribute to curiosity under different task demands.
- Mediation inferences are based on functional associations (correlation/mediation) rather than direct anatomical pathways; causal and circuit-level mechanisms remain to be established.
- The high-pass filter and other preprocessing choices, as well as the fixed distortion level, may influence generalizability; replication with varied parameters and stimuli would strengthen conclusions.