Abstract
Artificial neural networks (ANNs) suffer from catastrophic forgetting: when trained on multiple tasks sequentially, they perform well on the most recent task at the expense of previously learned ones. This study tests the hypothesis that implementing a sleep-like phase in ANNs can alleviate catastrophic forgetting. Sleep was implemented as offline training with local unsupervised Hebbian plasticity rules and noisy input. The resulting 'Sleep Replay Consolidation' (SRC) algorithm recovered old tasks that would otherwise have been forgotten, with memories replayed spontaneously during the sleep phase. During sleep, representations became sparser and neuronal activity related to old tasks increased while activity related to new tasks decreased, suggesting that sleep-like dynamics can mitigate catastrophic forgetting.
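The sleep phase described above (offline training driven by noisy input, with a local, unsupervised Hebbian rule) can be illustrated with a minimal sketch. This is not the authors' implementation: the network size, thresholds, noise rate, and learning rate below are illustrative assumptions, not the paper's reported architecture or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal two-layer network; in practice the weights would come from
# supervised training on tasks presented sequentially.
W1 = rng.normal(0, 0.1, (100, 784))   # input -> hidden
W2 = rng.normal(0, 0.1, (10, 100))    # hidden -> output

def sleep_phase(W1, W2, steps=1000, noise_rate=0.05,
                thr_hidden=2.0, thr_out=2.0, lr=1e-3):
    """Sleep-like offline phase: drive the network with random binary
    noise, binarize activations with fixed thresholds (spike-like units),
    and apply a local Hebbian rule -- potentiate weights whose pre- and
    post-synaptic units are both active, depress weights whose
    post-synaptic unit fires without pre-synaptic input.
    All hyperparameter values here are illustrative assumptions."""
    for _ in range(steps):
        x = (rng.random(784) < noise_rate).astype(float)  # noisy input
        h = (W1 @ x > thr_hidden).astype(float)           # hidden "spikes"
        o = (W2 @ h > thr_out).astype(float)              # output "spikes"
        # Local Hebbian update: no labels and no backpropagated error,
        # only pre/post co-activity at each layer.
        W1 += lr * (np.outer(h, x) - np.outer(h, 1.0 - x))
        W2 += lr * (np.outer(o, h) - np.outer(o, 1.0 - h))
    return W1, W2

W1, W2 = sleep_phase(W1, W2)
```

Because the update at each synapse depends only on its own pre- and post-synaptic activity, the sleep phase needs no task labels or stored training data, which is what allows it to run purely offline between tasks.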
Publisher
Nature Communications
Published On
Dec 15, 2022
Authors
Timothy Tadros, Giri P. Krishnan, Ramyaa Ramyaa, Maxim Bazhenov
Tags
artificial neural networks
catastrophic forgetting
sleep phase
memory retention
Hebbian plasticity
learning
neuroscience
Listen, Learn & Level Up
Over 10,000 hours of research content in 25+ fields, available in 12+ languages.
No more digging through PDFs—just hit play and absorb the world's latest research in your language, on your time.
listen to research audio papers with researchbunny