This paper investigates the structural patterns that allow sparse recurrent neural networks (RNNs) to match the performance of their fully connected counterparts while requiring less energy and memory. The authors analyze RNNs trained with various sparsification strategies (pruning and rewiring) across different tasks and identify a universal pattern of signed motifs: optimized sparse topologies exhibit structural balance, which improves performance. The same structural balance appears in other state-of-the-art models, such as Neural ODEs and continuous-time RNNs, suggesting a general principle for designing sparse network architectures.
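To make the two key ingredients concrete, here is a minimal, hypothetical sketch (not the authors' code): magnitude-based pruning of a recurrent weight matrix to a target sparsity, followed by a count of how many signed triangles in the surviving topology are structurally balanced, i.e. have a positive product of edge signs. The function names, sparsity level, and toy weight matrix are illustrative assumptions.

```python
import numpy as np

def magnitude_prune(W, sparsity=0.9):
    """Zero out the smallest-magnitude entries of W, keeping a (1 - sparsity) fraction."""
    flat = np.abs(W).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return W.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(W) > threshold, W, 0.0)

def balanced_triangle_fraction(W):
    """Fraction of signed triangles whose sign product is positive (structural balance).

    For simplicity the directed weight matrix is treated as an undirected signed
    graph built from its upper triangle; self-loops are ignored.
    """
    S = np.sign(np.triu(W, k=1))
    S = S + S.T
    n = S.shape[0]
    balanced, total = 0, 0
    for i in range(n):
        for j in range(i + 1, n):
            if S[i, j] == 0:
                continue
            for k in range(j + 1, n):
                if S[i, k] == 0 or S[j, k] == 0:
                    continue
                total += 1
                if S[i, j] * S[i, k] * S[j, k] > 0:
                    balanced += 1
    return balanced / total if total else float("nan")

# Toy example: prune a random dense recurrent weight matrix and measure balance.
rng = np.random.default_rng(0)
W = rng.normal(size=(50, 50))
W_sparse = magnitude_prune(W, sparsity=0.9)
print("balanced triangle fraction:", balanced_triangle_fraction(W_sparse))
```

For an unstructured random matrix the balanced fraction stays near 0.5; the paper's claim is that topologies optimized by pruning or rewiring during training shift this statistic toward balance.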
Publisher: Communications Physics
Published On: Sep 08, 2023
Authors: Xin-Jie Zhang, Jack Murdoch Moore, Gang Yan, Xiang Li
Tags: sparse recurrent neural networks, performance optimization, structural patterns, energy efficiency, memory reduction, sparsification strategies