Decoding predicted future states from the brain's "physics engine"

Psychology

R. T. Pramod, E. Mieczkowski, et al.

Discover evidence that a network in the human parietal and frontal lobes runs forward simulations to predict physical events: it encodes object contact and predicts future collisions. This preregistered study, conducted by R. T. Pramod, Elizabeth Mieczkowski, Cyn X. Fang, Joshua B. Tenenbaum, and Nancy Kanwisher, supports the brain’s "physics engine" hypothesis.

Abstract
Successful engagement with the physical world requires the ability to predict future events and plan interventions to alter that future. Growing evidence implicates a set of regions in the human parietal and frontal lobes [the "physics network" (PN)] in such intuitive physical inferences. However, the central tenet of this hypothesis, that PN runs forward simulations to predict future states, remains untested. In this preregistered study, we first show that PN abstractly represents whether two objects are in contact, a physical scene property critical for prediction. We then show that PN (but not other visual areas) carries abstract information about predicted future contact events (collisions). These findings support the hypothesis that PN contains a generative model of the physical world that conducts forward simulations, serving as the brain’s "physics engine."
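The core claim — that the brain runs a generative model forward in time to predict contact events such as collisions — can be illustrated with a toy forward simulation. The snippet below is a minimal sketch for intuition only, not the study's actual model or stimuli: it steps two objects' positions forward and reports whether they are predicted to come into contact (center distance at or below the sum of their radii) within a time horizon. All names and parameters here are illustrative assumptions.

```python
def will_collide(p1, v1, p2, v2, radius=1.0, dt=0.01, horizon=5.0):
    """Toy forward simulation: advance two circular objects along
    constant-velocity paths and report whether they come into contact
    (center distance <= sum of radii) within the time horizon.

    p1, p2: (x, y) initial positions; v1, v2: (vx, vy) velocities.
    """
    t = 0.0
    while t < horizon:
        # Predicted relative displacement at time t.
        dx = (p1[0] + v1[0] * t) - (p2[0] + v2[0] * t)
        dy = (p1[1] + v1[1] * t) - (p2[1] + v2[1] * t)
        if (dx * dx + dy * dy) ** 0.5 <= 2 * radius:
            return True  # predicted future contact event
        t += dt
    return False  # no collision predicted within the horizon


# Two objects approaching head-on are predicted to collide;
# two objects moving in parallel are not.
approaching = will_collide((0, 0), (1, 0), (10, 0), (-1, 0))
parallel = will_collide((0, 0), (0, 1), (10, 0), (0, 1))
```

The point of the sketch is the structure of the computation: prediction falls out of rolling the model forward, which is what the study proposes the physics network does.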
Publisher
Science Advances
Published On
May 30, 2025
Authors
R. T. Pramod, Elizabeth Mieczkowski, Cyn X. Fang, Joshua B. Tenenbaum, Nancy Kanwisher
Tags
physics network
forward simulation
object contact representation
collision prediction
parietal and frontal lobes
generative model
intuitive physical inference