Jahr | 2022 |
Autor(en) | Benjamin Cramer and Sebastian Billaudelle and Simeon Kanya and Aron Leibfried and Andreas Grübl and Vitali Karasenko and Christian Pehle and Korbinian Schreiber and Yannik Stradmann and Johannes Weis and Johannes Schemmel and Friedemann Zenke |
Titel | Surrogate gradients for analog neuromorphic computing |
KIP-Nummer | HD-KIP 22-03 |
KIP-Gruppe(n) | F9 |
Dokumentart | Paper |
Quelle | Proceedings of the National Academy of Sciences 119 (4), e2109194119 (2022) |
doi | 10.1073/pnas.2109194119 |
Abstract (en) | Neuromorphic systems aim to accomplish efficient computation in electronics by mirroring neurobiological principles. Taking advantage of neuromorphic technologies requires effective learning algorithms capable of instantiating high-performing neural networks, while also dealing with inevitable manufacturing variations of individual components, such as memristors or analog neurons. We present a learning framework resulting in bioinspired spiking neural networks with high performance, low inference latency, and sparse spike-coding schemes, which also self-corrects for device mismatch. We validate our approach on the BrainScaleS-2 analog spiking neuromorphic system, demonstrating state-of-the-art accuracy, low latency, and energy efficiency. Our work sketches a path for building powerful neuromorphic processors that take advantage of emerging analog technologies. To rapidly process temporal information at a low metabolic cost, biological neurons integrate inputs as an analog sum, but communicate with spikes, binary events in time. Analog neuromorphic hardware uses the same principles to emulate spiking neural networks with exceptional energy efficiency. However, instantiating high-performing spiking networks on such hardware remains a significant challenge due to device mismatch and the lack of efficient training algorithms. Surrogate gradient learning has emerged as a promising training strategy for spiking networks, but its applicability for analog neuromorphic systems has not been demonstrated. Here, we demonstrate surrogate gradient learning on the BrainScaleS-2 analog neuromorphic system using an in-the-loop approach. We show that learning self-corrects for device mismatch, resulting in competitive spiking network performance on both vision and speech benchmarks.
Our networks display sparse spiking activity with, on average, less than one spike per hidden neuron and input, perform inference at rates of up to 85,000 frames per second, and consume less than 200 mW. In summary, our work sets several benchmarks for low-energy spiking network processing on analog neuromorphic hardware and paves the way for future on-chip learning algorithms. There are no data underlying this work. Driver software and code examples are available on GitHub (https://github.com/fmi-basel/brainscales-2-surrogate-gradients). |
bibtex | @article{cramer2022surrogate, author = {Cramer, Benjamin and Billaudelle, Sebastian and Kanya, Simeon and Leibfried, Aron and Gr{\"u}bl, Andreas and Karasenko, Vitali and Pehle, Christian and Schreiber, Korbinian and Stradmann, Yannik and Weis, Johannes and Schemmel, Johannes and Zenke, Friedemann}, title = {Surrogate gradients for analog neuromorphic computing}, journal = {Proceedings of the National Academy of Sciences}, year = {2022}, volume = {119}, number = {4}, elocation-id = {e2109194119}, doi = {10.1073/pnas.2109194119} } |
Datei | |
URL | https://www.pnas.org/content/119/4/e2109194119.full |