Artificial neural networks (ANNs) suffer from catastrophic forgetting: when trained on multiple tasks sequentially, they perform well on the most recent task at the expense of previously learned ones. This study tests the hypothesis that implementing a sleep-like phase in ANNs can alleviate this problem. Sleep was implemented as offline training with local unsupervised Hebbian plasticity rules and noisy input. The resulting 'Sleep Replay Consolidation' (SRC) algorithm recovered tasks that would otherwise have been forgotten, with memories of old tasks replayed spontaneously during the sleep phase. Sleep increased representational sparseness and the activity of neurons associated with old tasks while reducing activity related to the new task, suggesting that sleep-like dynamics can mitigate catastrophic forgetting.
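The abstract describes SRC only at a high level (offline training driven by noisy input, with local unsupervised Hebbian updates). As a rough illustration of that idea, the sketch below runs a feedforward network on random binary noise and applies a simple Hebbian rule: strengthen weights where pre- and postsynaptic units are co-active, weaken them where the presynaptic unit fires but the postsynaptic unit stays silent. All names and hyperparameters (`sleep_phase`, `noise_rate`, `inc`, `dec`, the threshold heuristic) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sleep_phase(weights, input_dim, steps=1000, noise_rate=0.1,
                inc=0.001, dec=0.0005, thresholds=None):
    """Illustrative sleep-like consolidation pass (not the paper's exact SRC code).

    weights    : list of layer weight matrices [W1, W2, ...], each of shape (n_out, n_in)
    noise_rate : probability that an input unit is active at each sleep step
    inc / dec  : Hebbian potentiation / depression magnitudes
    """
    if thresholds is None:
        # Simple per-layer activation thresholds; an assumption for this sketch.
        thresholds = [0.5 * W.sum(axis=1).mean() for W in weights]

    for _ in range(steps):
        # Noisy input drives spontaneous activity; no labels or task data are used.
        pre = (np.random.rand(input_dim) < noise_rate).astype(float)
        for layer, W in enumerate(weights):
            # Binary, spiking-style activation against the layer threshold.
            post = (W @ pre > thresholds[layer]).astype(float)
            # Local Hebbian rule: potentiate weights with co-active pre/post units,
            # depress weights whose presynaptic unit fired but postsynaptic did not.
            W += inc * np.outer(post, pre) - dec * np.outer(1.0 - post, pre)
            pre = post
    return weights
```

In this toy version, the noise-driven forward passes play the role of spontaneous replay: units whose weights encode earlier tasks tend to be reactivated and their connections reinforced without any access to the original training data.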
Publisher
Nature Communications
Published On
Dec 15, 2022
Authors
Timothy Tadros, Giri P. Krishnan, Ramyaa Ramyaa, Maxim Bazhenov
Tags
artificial neural networks
catastrophic forgetting
sleep phase
memory retention
Hebbian plasticity
learning
neuroscience