Spoken language comprehension requires abstracting linguistic information from speech, but the interaction between auditory and linguistic processing of speech remains poorly understood. This study investigates this abstraction using intracranial neural responses while participants listened to conversational English speech. Leveraging language-specific patterns where phonological and acoustic information diverge, the study demonstrates the causal efficacy of the phoneme as a unit of analysis and dissociates the unique contributions of phonemic and spectrographic information to neural responses. Quantitative higher-order response models reveal that unique contributions of phonological information are carried in the covariance structure of the stimulus-response relationship, suggesting linguistic abstraction involves integration across multiple spectro-temporal features and prior phonological information. These results link speech acoustics to phonology and morphosyntax, supporting predictions about abstractness in linguistic theory and providing evidence for the acoustic features supporting that abstraction.
Publisher: Nature Communications
Published On: Jan 23, 2024
Authors: Anna Mai, Stephanie Riès, Sharona Ben-Haim, Jerry J. Shih, Timothy Q. Gentner
Tags: spoken language, auditory processing, linguistic information, neural responses, phonology, speech comprehension, phonemic analysis