B-SOID, an open-source unsupervised algorithm for identification and fast prediction of behaviors

Biology

A. I. Hsu and E. A. Yttri

Get ready to uncover the secrets of animal behavior with groundbreaking research by Alexander I. Hsu and Eric A. Yttri. Introducing B-SOID, an unsupervised algorithm that identifies behaviors from spatiotemporal pose patterns without user bias. This innovative approach not only improves processing speed but also opens new avenues for studying pain, OCD, and movement disorders across animal models.

~3 min • Beginner • English
Abstract
Studying naturalistic animal behavior remains a difficult objective. Recent machine learning advances have enabled limb localization; however, extracting behaviors requires ascertaining the spatiotemporal patterns of these positions. To provide a link from poses to actions and their kinematics, we developed B-SOID – an open-source, unsupervised algorithm that identifies behavior without user bias. By training a machine classifier on pose pattern statistics clustered using new methods, our approach achieves greatly improved processing speed and the ability to generalize across subjects or labs. Using a frameshift alignment paradigm, B-SOID overcomes previous temporal resolution barriers. Using only a single, off-the-shelf camera, B-SOID provides categories of sub-action for trained behaviors and kinematic measures of individual limb trajectories in any animal model. These behavioral and kinematic measures are difficult but critical to obtain, particularly in the study of rodent and other models of pain, OCD, and movement disorders.
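The abstract's frameshift alignment paradigm can be illustrated with a small sketch: when behavior labels are predicted over multi-frame feature windows, running the classifier on several time-shifted copies of the data and interleaving the results recovers a label at every camera frame rather than one per window. The following is a minimal NumPy sketch of that idea, assuming a user-supplied `predict` function that maps one window of frames to a single label; the function name and interface are illustrative, not B-SOID's actual API.

```python
import numpy as np

def frameshift_labels(frames, window, predict):
    """Assign a behavior label to (nearly) every frame via frameshifting.

    frames  : array-like sequence of per-frame data
    window  : number of frames per prediction window
    predict : callable mapping one window of frames to a single label
    """
    n = len(frames)
    labels = np.full(n, -1, dtype=int)  # -1 marks frames with no prediction
    # Run predictions at stride `window` for each possible offset,
    # so the interleaved offsets together cover every frame.
    for offset in range(window):
        for start in range(offset, n - window + 1, window):
            labels[start] = predict(frames[start:start + window])
    return labels

# Toy usage: label each window by its first frame's value.
labels = frameshift_labels(np.arange(6), 3, lambda chunk: int(chunk[0]))
# Frames 0-3 each receive a label; the trailing frames that cannot
# start a full window remain -1.
```

Without the offset loop, only every `window`-th frame would receive a label; the frameshifted copies fill in the gaps, which is how the temporal resolution barrier described in the abstract is overcome.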
Publisher
Nature Communications
Published On
Aug 31, 2021
Authors
Alexander I. Hsu, Eric A. Yttri
Tags
animal behavior
B-SOID
machine learning
pose patterns
unsupervised algorithm
kinematic measures
behavior identification
Listen, Learn & Level Up
Over 10,000 hours of research content in 25+ fields, available in 12+ languages.
No more digging through PDFs, just hit play and absorb the world's latest research in your language, on your time.
Listen to research audio papers with ResearchBunny.