Neural signatures of emotional intent and inference align during social consensus

Psychology

M. C. Reddan, D. C. Ong, et al.

Humans quickly infer others’ emotional states. This study collected fMRI data from 100 observers as they rated the emotional intensity of people (targets) describing significant life events, and shows that multivariate brain models can predict both a target’s self-rated intent and observers’ inferences. Correspondence between the two models increases with empathic accuracy, yet a target’s intent remains decodable from observer brain activity even when observers err.
Abstract
Humans effortlessly transform dynamic social signals into inferences about other people’s internal states. Here we investigate the neural basis of this process by collecting fMRI data from 100 participants as they rate the emotional intensity of people (targets) describing significant life events. Targets provide self-ratings on the same scale. We then train and validate two unique multivariate models of observer brain activity. The first predicts the target’s self-ratings (i.e., intent), and the second predicts observer inferences. Correspondence between the intent and inference models’ predictions on novel test data increases when observers are more empathically accurate. However, even when observers make inaccurate inferences, the target’s intent can still be predicted from observer brain activity. These findings suggest that an observer’s brain contains latent representations of other people’s socioemotional intensity, and that fMRI models of intent and inference can be combined to predict empathic accuracy.
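The abstract describes two multivariate models trained on observer brain activity, one predicting the target's self-rated intent and one predicting the observer's inference, with their correspondence on held-out data compared against empathic accuracy. The paper's actual fMRI pipeline is far more involved; the sketch below only illustrates the logic of that analysis on simulated data, using closed-form ridge regression. All data, dimensions, and noise levels here are hypothetical stand-ins, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: rows = trials, columns = voxel-level
# features of observer brain activity (the real study uses fMRI).
n_trials, n_voxels = 200, 50
X = rng.normal(size=(n_trials, n_voxels))

# Simulated ratings on a shared intensity scale: the target's self-rated
# intent and the observer's noisier inference, both partly driven by X.
w_true = rng.normal(size=n_voxels)
intent = X @ w_true + rng.normal(scale=1.0, size=n_trials)
inference = X @ w_true + rng.normal(scale=2.0, size=n_trials)

def ridge_fit(X, y, alpha=10.0):
    """Closed-form ridge regression: w = (X'X + alpha*I)^-1 X'y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

# Train both models on the first half of trials; test on held-out trials.
half = n_trials // 2
w_intent = ridge_fit(X[:half], intent[:half])
w_infer = ridge_fit(X[:half], inference[:half])

pred_intent = X[half:] @ w_intent
pred_infer = X[half:] @ w_infer

# Model correspondence: correlation between the two models' predictions
# on novel test data (the quantity the paper relates to empathy).
correspondence = np.corrcoef(pred_intent, pred_infer)[0, 1]

# Empathic accuracy: agreement between inferred and actual intensity.
accuracy = np.corrcoef(inference[half:], intent[half:])[0, 1]

print(f"model correspondence r = {correspondence:.2f}")
print(f"empathic accuracy   r = {accuracy:.2f}")
```

Because both simulated ratings share a common latent signal, the two models' test-set predictions correlate; in the study's framing, that correspondence is what tracks empathic accuracy across observers.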
Publisher
Nature Communications
Published On
Jul 08, 2025
Authors
Marianne C. Reddan, Desmond C. Ong, Tor D. Wager, Sonny Mattek, Isabella Kahhale, Jamil Zaki
Tags
empathic accuracy
fMRI
multivariate brain models
intent prediction
observer inferences
socioemotional intensity
social cognition