Online dynamical learning and sequence memory with neuromorphic nanowire networks

R. Zhu, S. Lilak, et al.

Ruomin Zhu, Sam Lilak, Alon Loeffler, Joseph Lizier, Adam Stieg, James Gimzewski, and Zdenka Kuncic demonstrate online learning from spatio-temporal dynamical features with remarkable accuracy. Their work shows how dynamical learning not only enables image classification but also supports sequence memory recall, highlighting the role of memory in learning.

Introduction
Neuromorphic computing aims to mimic the brain's efficiency. While CMOS-based neuromorphic hardware is prevalent, an alternative approach leverages the brain-like properties of nanoscale materials. This research focuses on memristive nanowire networks (NWNs), composed of metal nanowires forming a heterogeneous network with synapse-like memristive switching at junctions. NWNs exhibit collective dynamics like phase transitions and avalanches, making them suitable for processing temporal signals. Reservoir computing (RC) is a machine learning approach that uses the rich dynamics of a recurrent network (the reservoir) to extract features, simplifying training to a linear output layer. NWNs, with their self-regulating dynamics and physical constraints, are effective physical reservoirs. Previous studies used batch-based learning, where training occurs after the entire input stream is processed. This limits scalability and adaptability to evolving feature distributions. Online training, where weights adapt incrementally, addresses these limitations, enabling continual learning. This study employs an NWN device to demonstrate online dynamical learning using the MNIST handwritten digit database and a novel sequence memory task, revealing the interplay between online learning and memory.
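As an illustration of the RC principle described above, the following minimal Python sketch simulates a small random recurrent reservoir as a stand-in for the physical nanowire network; all sizes and parameters are illustrative assumptions, not values from the study. The point is that the reservoir itself stays fixed and only a linear readout is fitted to its states.

```python
# Minimal reservoir computing sketch (illustrative only; a simulated random
# recurrent network stands in for the physical nanowire reservoir).
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_RES = 1, 100                                        # hypothetical sizes

W_in = rng.uniform(-0.5, 0.5, size=(N_RES, N_IN))           # fixed input weights
W_res = rng.normal(0.0, 1.0, size=(N_RES, N_RES))           # fixed recurrent weights
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))     # keep spectral radius < 1

def run_reservoir(u_seq, leak=0.3):
    """Drive the reservoir with a 1D input sequence and return the final state."""
    x = np.zeros(N_RES)
    for u in u_seq:
        x = (1 - leak) * x + leak * np.tanh(W_in @ np.atleast_1d(u) + W_res @ x)
    return x

state = run_reservoir(np.sin(np.linspace(0, 4 * np.pi, 50)))

# Training reduces to fitting a linear map from collected states X to targets Y,
# e.g. ridge regression: W_out = Y.T @ X @ np.linalg.inv(X.T @ X + reg * np.eye(N_RES))
```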
Literature Review
Existing neuromorphic computing approaches include CMOS-based systems implementing spike-based neural networks and approaches that exploit the inherent properties of nanoscale materials. Memristive devices, which exhibit resistive memory switching, are particularly promising. Previous research demonstrated the capability of NWNs for temporal learning tasks; however, online learning from spatio-temporal dynamics using NWNs remained unexplored, particularly for complex tasks such as MNIST classification. Batch-based RC approaches have limitations in handling large datasets and in adapting to non-stationary features. Online learning methods, while common in conventional machine learning, had yet to be applied to such neuromorphic systems to improve scalability and enable continual learning. The authors aim to bridge this gap by applying online learning to an NWN for MNIST digit classification and a novel sequence memory task.
Methodology
The researchers used an NWN device fabricated on a multi-electrode array (MEA) with 16 electrodes. Selenium nanowires were synthesized by hydrothermal reduction, and Ag₂Se nanowires were then synthesized and drop-cast onto the MEA to create the NWN. For MNIST digit classification, each image was converted into a 1D temporal stream of voltage pulses and delivered sequentially to an input electrode. Readout voltages from other electrodes captured the network's response. An online recursive least squares (RLS) algorithm trained a linear classifier externally, updating weights after each sample. The number of readout channels (1 and 5 were tested) determined the feature dimensionality. For comparison, batch learning using backpropagation was also performed. For the sequence memory task, a semi-repetitive sequence of MNIST digits was presented. The network's conductance, in addition to the readout voltages, provided memory features. Using a sliding memory window, the first digit in the sequence was reconstructed from the features of subsequent digits. Image reconstruction quality was assessed using the Structural Similarity Index Measure (SSIM). Mutual information (MI) was calculated to quantify the information content of the NWN readouts and relate it to classification accuracy. The impact of memory on learning was examined by excluding conductance features, replacing them with voltage readouts, and observing the resulting SSIM.
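A minimal sketch of the online RLS update for the external linear classifier is shown below. This is a textbook RLS formulation, not the authors' code; the feature dimension (e.g. 5 channels times an assumed number of time samples), the forgetting factor, and the initialisation are illustrative assumptions.

```python
# Sketch of online recursive least squares (RLS) for a linear classifier trained
# sample-by-sample (illustrative; dimensions and hyperparameters are assumptions).
import numpy as np

class OnlineRLSClassifier:
    def __init__(self, n_features, n_classes, delta=1.0, lam=1.0):
        self.W = np.zeros((n_classes, n_features))   # readout weights
        self.P = np.eye(n_features) / delta          # inverse correlation matrix
        self.lam = lam                               # forgetting factor (1.0 = none)

    def update(self, x, y_onehot):
        """Rank-1 weight update after a single sample (features x, one-hot target)."""
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)                 # RLS gain vector
        err = y_onehot - self.W @ x                  # prediction error before the update
        self.W += np.outer(err, k)
        self.P = (self.P - np.outer(k, Px)) / self.lam

    def predict(self, x):
        return int(np.argmax(self.W @ x))

# Usage sketch: x would hold readout voltages sampled while a digit's pulse
# stream is applied; random features stand in for them here.
rng = np.random.default_rng(1)
clf = OnlineRLSClassifier(n_features=5 * 28, n_classes=10)   # 5 channels x 28 samples (assumed)
x = rng.random(5 * 28)
clf.update(x, np.eye(10)[3])
print(clf.predict(x))
```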
Key Findings
The online dynamical learning approach using the NWN device achieved 93.4% accuracy on the MNIST handwritten digit classification task with 5 readout channels, outperforming the batch method (91.4%). Accuracy increased with the number of readout channels, reflecting the richness of the dynamical features. The learning rate, reflected in the magnitude of weight matrix changes, peaked at around 10²–10³ samples, coinciding with the saturation of mutual information (MI) between digit classes and NWN readouts. MI analysis revealed distinct information content extracted from different channels for different digit classes. In the sequence memory task, digits were successfully reconstructed from a sequence based on memory patterns embedded in the conductance readouts. The SSIM of the reconstructed images increased with the length of the memory window, demonstrating the role of memory in recall, and excluding the conductance memory features reduced reconstruction quality, confirming memory's contribution to online learning. Digit classes with higher MI values tended to exhibit higher classification accuracy, indicating a direct link between information dynamics and learning performance, and there was evidence of channel preference for certain digit classes.
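As a rough illustration of the kind of MI analysis reported here (not the authors' estimator), the sketch below computes a simple histogram-based mutual information between digit-class labels and a discretised readout channel; the binning scheme and the synthetic data are assumptions for demonstration only.

```python
# Illustrative plug-in (histogram) estimate of mutual information, in bits,
# between discrete class labels and a binned continuous readout channel.
import numpy as np

def mutual_information(labels, values, n_bins=16):
    """MI between integer labels and continuous values discretised into quantile bins."""
    edges = np.quantile(values, np.linspace(0, 1, n_bins + 1))
    binned = np.digitize(values, edges[1:-1])                 # bin indices 0..n_bins-1
    joint, _, _ = np.histogram2d(labels, binned, bins=[np.max(labels) + 1, n_bins])
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

# Synthetic example: 10 digit classes, one readout value per sample, loosely class-dependent.
rng = np.random.default_rng(2)
labels = rng.integers(0, 10, size=2000)
values = labels * 0.1 + rng.normal(0, 0.2, size=2000)
print(mutual_information(labels, values))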
Discussion
The high classification accuracy achieved by online learning using the NWN highlights the potential of these devices for efficient and adaptive machine learning. The observed correlation between MI and classification accuracy demonstrates a strong connection between information dynamics and learning performance. The sequence memory task showcases the NWN's ability to embed and recall temporal patterns, offering parallels to the brain's memory mechanisms. The success of this research suggests the potential for all-analogue hardware implementations for efficient online dynamical learning. These findings demonstrate a move beyond traditional data-driven machine learning, utilizing the intrinsic dynamics of the physical system itself for computation.
Conclusion
This study successfully demonstrated online dynamical learning and sequence memory recall using an NWN device for MNIST digit classification and a novel sequence memory task. The results show high classification accuracy, a clear link between information dynamics and learning, and the crucial role of memory in enhancing learning and recall. The work opens avenues for all-analogue hardware implementations of online learning and suggests potential applications in areas such as natural language processing and image analysis. Future work could investigate the optimization of channel selection and explore more complex sequence learning tasks.
Limitations
The study focused on a specific NWN device fabrication and experimental setup. The generalizability of the findings to other NWN architectures and materials requires further investigation. The sequence memory task used a relatively simple, semi-repetitive sequence. More complex and less predictable sequences may reveal additional challenges for the NWN. The external linear classifier limits the system's potential compared to fully integrated, in-materio computation.