Learning noise-induced transitions by multi-scaling reservoir computing

Interdisciplinary Studies

Z. Lin, Z. Lu, et al.

Zequn Lin, Zhaofan Lu, Zengru Di, and Ying Tang use reservoir computing to uncover noise-induced transitions in dynamical systems. Their multi-scaling approach captures slow transitions hidden in noisy data, recovering stochastic transition statistics, and for colored noise even specific transition times, more accurately than conventional techniques.

Abstract
This paper explores the use of reservoir computing, a machine learning model, to learn noise-induced transitions in dynamical systems. The authors propose a multi-scaling approach that leverages a key hyperparameter to control the timescale of the reservoir dynamics, allowing it to effectively capture slow transitions amidst fast-scale noise. The method is demonstrated on various systems (bistable with white/colored noise, multistable) and experimental protein folding data, showing superior performance compared to conventional techniques like SINDy and recurrent neural networks in capturing stochastic transition statistics and even specific transition times for colored noise. The approach demonstrates potential for analyzing noisy time series beyond simply noise reduction.
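The abstract's key idea is that a single hyperparameter sets the timescale of the reservoir dynamics, letting a slow reservoir follow rare transitions while averaging over fast noise. A minimal sketch of a leaky reservoir update illustrates this; the function and variable names here are illustrative, not the authors' implementation, and the specific hyperparameter used in the paper may differ from the leak rate shown below.

```python
import numpy as np

def leaky_reservoir_step(r, u, W, W_in, alpha):
    """One update of a leaky reservoir state r driven by input u.

    alpha in (0, 1] is the leak rate: a small alpha slows the reservoir's
    internal timescale, which is the kind of timescale-controlling
    hyperparameter the paper's multi-scaling approach tunes.
    """
    return (1 - alpha) * r + alpha * np.tanh(W @ r + W_in @ u)

rng = np.random.default_rng(0)
N, d = 100, 1                                   # reservoir size, input dimension
W = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))
W_in = rng.normal(size=(N, d))

# Drive the reservoir with a noisy scalar signal at two leak rates.
u_seq = rng.normal(size=(200, d))
r_fast = np.zeros(N)                            # fast reservoir, tracks the noise
r_slow = np.zeros(N)                            # slow reservoir, filters it
for u in u_seq:
    r_fast = leaky_reservoir_step(r_fast, u, W, W_in, alpha=0.9)
    r_slow = leaky_reservoir_step(r_slow, u, W, W_in, alpha=0.05)
```

In a full echo state network, a linear readout would then be trained on the reservoir states; the point of the sketch is only that the leak rate separates slow dynamics from fast fluctuations.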
Publisher
Nature Communications
Published On
Aug 03, 2024
Authors
Zequn Lin, Zhaofan Lu, Zengru Di, Ying Tang
Tags
reservoir computing
stochastic transitions
dynamical systems
multi-scaling approach
noise reduction
machine learning
transition statistics