This paper presents a deep neural network model that automatically detects eye contact in egocentric video with accuracy comparable to that of human experts. Trained on a large dataset of annotated images from diverse subjects, including individuals with Autism Spectrum Disorder, the model achieved a validation precision of 0.936 and recall of 0.943, matching the performance of 10 human coders (mean precision 0.918, recall 0.946). This scalable, objective tool offers significant potential for gaze behavior analysis in clinical and research settings.
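For readers less familiar with the evaluation metrics quoted above, the sketch below shows how precision and recall are computed from binary confusion-matrix counts. The counts are hypothetical, chosen only so the ratios reproduce the reported values; they are not the paper's actual validation data.

```python
# Illustrative only: precision/recall from confusion-matrix counts.
# tp = true positives, fp = false positives, fn = false negatives.
# These specific counts are made up to match the reported ratios.

def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Return (precision, recall) for a binary detector."""
    precision = tp / (tp + fp)  # fraction of predicted eye-contact frames that are correct
    recall = tp / (tp + fn)     # fraction of true eye-contact frames that are detected
    return precision, recall

p, r = precision_recall(tp=936, fp=64, fn=57)
print(round(p, 3), round(r, 3))  # 0.936 0.943
```

High precision means few false alarms; high recall means few missed eye-contact events. The paper reports both near 0.94, on par with trained human annotators.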
Publisher
Nature Communications
Published On
Oct 27, 2020
Authors
Eunji Chong, Elysha Clark-Whitney, Audrey Southerland, Elizabeth Stubbs, Chanel Miller, Eliana L. Ajodan, Melanie R. Silverman, Catherine Lord, Agata Rozga, Rebecca M. Jones, James M. Rehg
Tags
deep neural network
eye contact detection
egocentric video
gaze behavior analysis
Autism Spectrum Disorder
precision
recall