The cortical representation of language timescales is shared between reading and listening

Linguistics and Languages


C. Chen, T. Dupré la Tour, et al.

This study offers insight into how the brain processes language, revealing shared cortical representations for reading and listening across different timescales. The research, conducted by Catherine Chen, Tom Dupré la Tour, Jack L. Gallant, Daniel Klein, and Fatma Deniz, sheds light on the cognitive mechanisms underlying language integration.

Abstract
This study investigates whether brain representations of language processing timescales are shared between reading and listening. Using fMRI data from participants reading and listening to the same narratives, voxelwise encoding models were employed to determine cortical representations of different timescales (operationalized as spectral components varying over different word counts). Results show similar timescale representations across the cortex for both modalities, suggesting that after initial sensory processing, language integration proceeds similarly regardless of input modality.
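The abstract's core method is a voxelwise encoding model: a regularized linear regression that predicts each fMRI voxel's response from stimulus features, here features capturing different language timescales. The sketch below illustrates the general technique only, with synthetic data standing in for the real stimulus features and BOLD responses; the shapes, the ridge penalty, and the feature matrix are all hypothetical, not the authors' actual pipeline.

```python
import numpy as np

# Minimal sketch of a voxelwise encoding model (ridge regression),
# assuming synthetic data. In the real study, X would hold timescale-
# specific stimulus features and Y the measured voxel responses.

rng = np.random.default_rng(0)
n_trs, n_features, n_voxels = 200, 50, 10  # hypothetical dimensions

X = rng.standard_normal((n_trs, n_features))          # stimulus features per time point
true_w = rng.standard_normal((n_features, n_voxels))  # ground-truth weights (synthetic)
Y = X @ true_w + 0.1 * rng.standard_normal((n_trs, n_voxels))  # noisy voxel responses

# Ridge solution w = (X^T X + alpha I)^{-1} X^T Y, fit jointly for all voxels
alpha = 1.0
w_hat = np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ Y)

# Model performance per voxel: correlation of predicted vs. actual responses.
# Comparing such scores across reading and listening is what lets one ask
# whether timescale representations are shared between modalities.
Y_pred = X @ w_hat
r = [np.corrcoef(Y_pred[:, v], Y[:, v])[0, 1] for v in range(n_voxels)]
```

In practice, encoding-model studies fit weights on held-out training data and evaluate prediction accuracy on separate test narratives; the per-voxel scores are then mapped onto the cortical surface.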
Publisher
Communications Biology
Published On
Mar 07, 2024
Authors
Catherine Chen, Tom Dupré la Tour, Jack L. Gallant, Daniel Klein, Fatma Deniz
Tags
language processing
fMRI
voxelwise encoding models
cortical representations
modality
narratives