Abstract
This paper investigates the structural patterns of sparse recurrent neural networks (RNNs) that match the performance of their fully connected counterparts while requiring less energy and memory. The authors analyze RNNs trained with various sparsification strategies (pruning and rewiring) across different tasks and discover a universal pattern of signed motifs: optimized sparse topologies exhibit structural balance, which improves performance. The same structural balance appears in other state-of-the-art models, such as Neural ODEs and continuous-time RNNs, suggesting a general principle for optimizing sparse network architectures.
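The two ideas the abstract names, sparsification by pruning and structural balance of signed motifs, can be illustrated with a short sketch. The snippet below is not the authors' code: `magnitude_prune` and `balanced_triangle_fraction` are hypothetical helpers, magnitude pruning is only one of the strategies the paper compares, and the balance measure here is the classical undirected signed-triangle count rather than the paper's directed signed-motif analysis.

```python
import numpy as np

def magnitude_prune(W, sparsity):
    """Zero out the smallest-magnitude entries of W until the target
    fraction of weights is removed (one common pruning scheme; an
    illustrative stand-in for the strategies compared in the paper)."""
    flat = np.abs(W).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return W.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest |weight|
    pruned = W.copy()
    pruned[np.abs(W) <= threshold] = 0.0
    return pruned

def balanced_triangle_fraction(W):
    """Fraction of closed signed triangles whose edge-sign product is
    positive, i.e. balanced in the classical structural-balance sense.
    Simplification: treats the weight matrix as an undirected signed
    graph by keeping only the upper triangle and symmetrizing."""
    U = np.triu(np.sign(W), 1)
    S = U + U.T                                  # signed adjacency, no self-loops
    A = (S != 0).astype(float)                   # unsigned adjacency
    total = np.trace(A @ A @ A) / 6.0            # number of triangles
    signed = np.trace(S @ S @ S) / 6.0           # balanced minus unbalanced
    if total == 0:
        return np.nan
    return 0.5 * (1.0 + signed / total)

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))                    # dense recurrent weights
W_sparse = magnitude_prune(W, sparsity=0.9)      # keep ~10% of connections
print(f"nonzero weights: {np.count_nonzero(W_sparse)}")
print(f"balanced triangle fraction: {balanced_triangle_fraction(W_sparse):.3f}")
```

On random weights this fraction hovers near chance level; the paper's finding is that trained-and-sparsified topologies deviate systematically toward balance.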
Publisher
Communications Physics
Published On
Sep 08, 2023
Authors
Xin-Jie Zhang, Jack Murdoch Moore, Gang Yan, Xiang Li
Tags
sparse recurrent neural networks
performance optimization
structural patterns
energy efficiency
memory reduction
sparsification strategies