Sleep-like unsupervised replay reduces catastrophic forgetting in artificial neural networks

T. Tadros, G. P. Krishnan, et al.

Timothy Tadros, Giri P. Krishnan, Ramyaa Ramyaa, and Maxim Bazhenov present an approach to combating catastrophic forgetting in artificial neural networks. Their 'Sleep Replay Consolidation' algorithm recovers otherwise forgotten knowledge through sleep-like dynamics, informing our understanding of memory retention in machines.

Abstract
Artificial neural networks are known to suffer from catastrophic forgetting: when learning multiple tasks sequentially, they perform well on the most recent task at the expense of previously learned tasks. In the brain, sleep is known to play an important role in incremental learning by replaying recent and old conflicting memory traces. Here we tested the hypothesis that implementing a sleep-like phase in artificial neural networks can protect old memories during new training and alleviate catastrophic forgetting. Sleep was implemented as off-line training with local unsupervised Hebbian plasticity rules and noisy input. In an incremental learning framework, sleep was able to recover old tasks that were otherwise forgotten. Previously learned memories were replayed spontaneously during sleep, forming unique representations for each class of inputs. Representational sparseness and neuronal activity corresponding to the old tasks increased, while activity related to the new task decreased. The study suggests that spontaneous replay simulating sleep-like dynamics can alleviate catastrophic forgetting in artificial neural networks.
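The abstract describes the sleep phase as off-line training driven by noisy input with local, unsupervised Hebbian updates. The sketch below illustrates that general idea on a toy two-layer network; it is a simplified assumption-based illustration, not the authors' exact Sleep Replay Consolidation algorithm (which additionally rescales weights and uses spiking-style dynamics). All names (`sleep_phase`, `noise_rate`, the layer sizes) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network standing in for a previously trained classifier.
# Weights here are random; in practice they would come from supervised training.
W1 = rng.normal(0.0, 0.5, size=(20, 10))  # input -> hidden
W2 = rng.normal(0.0, 0.5, size=(10, 5))   # hidden -> output

def sleep_phase(W1, W2, steps=100, lr=0.001, noise_rate=0.2):
    """Sleep-like replay: drive the network with random binary noise
    and apply a purely local Hebbian update at each layer.
    Illustrative sketch only (assumed rule, not the paper's exact one)."""
    for _ in range(steps):
        # Noisy binary input, no labels involved (unsupervised).
        x = (rng.random(W1.shape[0]) < noise_rate).astype(float)
        # Binary (thresholded) activations propagate forward.
        h = (W1.T @ x > 0).astype(float)
        y = (W2.T @ h > 0).astype(float)
        # Hebbian rule: potentiate weights between co-active units,
        # depress weights into an active unit from silent inputs.
        W1 += lr * np.outer(2 * x - 1, h)
        W2 += lr * np.outer(2 * h - 1, y)
    return W1, W2

W1_after, W2_after = sleep_phase(W1.copy(), W2.copy())
```

Because the update at each layer depends only on that layer's pre- and post-synaptic activity, no task labels or stored training data are needed during the sleep phase, which is what lets it run off-line between tasks.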
Publisher
Nature Communications
Published On
Dec 15, 2022
Authors
Timothy Tadros, Giri P. Krishnan, Ramyaa Ramyaa, Maxim Bazhenov
Tags
artificial neural networks
catastrophic forgetting
sleep phase
memory retention
Hebbian plasticity
learning
neuroscience