Abstract
This paper presents a deep neural network model that automatically detects eye contact in egocentric video with accuracy comparable to that of human experts. Trained on a large dataset of annotated images from diverse subjects, including individuals with Autism Spectrum Disorder, the model achieves high precision (0.936) and recall (0.943) on validation data, matching the performance of 10 human coders (mean precision 0.918, recall 0.946). This scalable, objective tool offers significant potential for gaze behavior analysis in clinical and research settings.
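The precision and recall figures above summarize detection quality frame by frame: precision is the fraction of predicted eye-contact frames that are truly eye contact, and recall is the fraction of true eye-contact frames the model finds. A minimal sketch of how such metrics are computed from per-frame counts (the counts here are illustrative, not the paper's data):

```python
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Compute precision and recall from frame-level counts.

    tp: frames correctly labeled as eye contact (true positives)
    fp: frames wrongly labeled as eye contact (false positives)
    fn: eye-contact frames the model missed (false negatives)
    """
    precision = tp / (tp + fp)  # of predicted positives, how many were right
    recall = tp / (tp + fn)     # of actual positives, how many were found
    return precision, recall


# Illustrative counts only -- not taken from the paper's dataset.
p, r = precision_recall(tp=900, fp=100, fn=100)
print(p, r)  # 0.9 0.9
```

Comparing the model's precision/recall pair against the distribution of scores from multiple human coders, as the paper does, is a common way to claim expert-level performance on a subjective annotation task.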
Publisher
Nature Communications
Published On
Oct 27, 2020
Authors
Eunji Chong, Elysha Clark-Whitney, Audrey Southerland, Elizabeth Stubbs, Chanel Miller, Eliana L. Ajodan, Melanie R. Silverman, Catherine Lord, Agata Rozga, Rebecca M. Jones, James M. Rehg
Tags
deep neural network
eye contact detection
egocentric video
gaze behavior analysis
Autism Spectrum Disorder
precision
recall