The cognitive capabilities of the human brain have long inspired the research and development of novel computing paradigms that promise self-adapting, fast, and energy-efficient computation. In this thesis, we address some of the challenges that arise on the road to biology-inspired processing and, to that end, propose novel transistor-level electronic circuits as well as algorithms for brain-inspired learning in neuromorphic systems.
We first present a collection of neuron circuits for \bss2, a neuromorphic platform capturing some of the nervous system's key properties in integrated analog electronics. Our silicon neurons accurately emulate the dynamics of the underlying model equations at 1000-fold accelerated time scales and can precisely replicate the rich dynamics and firing patterns observed in electrophysiological recordings, as sketched below. They furthermore serve as a building block for functional networks.
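The firing regimes referenced above arise, for instance, from the adaptive exponential integrate-and-fire (AdEx) model implemented by \bss2. As a purely illustrative reference, the following Python sketch integrates the AdEx equations with forward Euler; all parameter values are placeholders rather than calibrated hardware targets, and the circuits themselves emulate these dynamics in analog electronics instead of solving them numerically.

\begin{verbatim}
import numpy as np

# Illustrative AdEx parameters in biological units; the hardware
# emulates equivalent dynamics at 1000-fold accelerated time scales.
C, g_L, E_L = 200e-12, 10e-9, -70e-3   # capacitance, leak, resting potential
V_T, d_T = -50e-3, 2e-3                # threshold, exponential slope
V_peak, V_reset = 0e-3, -58e-3         # spike cutoff and reset
tau_w, a, b = 100e-3, 2e-9, 50e-12     # adaptation dynamics

def simulate(I_stim, dt=1e-5, t_max=0.5):
    """Forward-Euler integration of the AdEx model equations."""
    v, w, spikes = E_L, 0.0, []
    for step in range(int(t_max / dt)):
        dv = (-g_L * (v - E_L)
              + g_L * d_T * np.exp((v - V_T) / d_T)
              - w + I_stim) / C
        dw = (a * (v - E_L) - w) / tau_w
        v, w = v + dt * dv, w + dt * dw
        if v >= V_peak:        # spike: reset membrane, increment adaptation
            v, w = V_reset, w + b
            spikes.append(step * dt)
    return spikes

print(len(simulate(I_stim=500e-12)), "spikes")
\end{verbatim}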
We discuss two different approaches to learning in such biology-inspired systems. Inspired by the tremendous progress of machine learning, we introduce a framework for gradient-based optimization of neuromorphic spiking neural networks. By attaching gradients to the internal, physically evolving dynamics and exposing them to automatic differentiation frameworks, our approach retains a general scope and can be applied to arbitrary loss functions and to feedforward as well as recurrent topologies. We demonstrate the capabilities of both our neuromorphic circuits and the training framework by solving inference tasks on image and natural language datasets. In the process, we exploit temporally sparse coding schemes as well as the accelerated emulation of neural dynamics to achieve energy-efficient inference at low classification latencies.
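To convey the general idea of this training scheme in software terms, the following sketch unrolls leaky integrate-and-fire dynamics in PyTorch and attaches a SuperSpike-style surrogate derivative to the non-differentiable spike threshold, so that arbitrary loss functions can be backpropagated through time. This is a minimal simulation-based analogue with illustrative names and constants, not the hardware-in-the-loop framework developed in this thesis.

\begin{verbatim}
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a smooth surrogate derivative."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # SuperSpike-style fast-sigmoid derivative
        return grad_output / (1.0 + 10.0 * v.abs()) ** 2

def lif_forward(x, w, beta=0.9):
    """Unroll leaky integrate-and-fire dynamics over time.

    x: input spikes (time, batch, n_in); w: weights (n_in, n_out).
    """
    v = torch.zeros(x.shape[1], w.shape[1])
    out = []
    for x_t in x:
        v = beta * v + x_t @ w                # leaky integration
        s = SurrogateSpike.apply(v - 1.0)     # threshold at v = 1
        v = v * (1.0 - s.detach())            # reset after spikes
        out.append(s)
    return torch.stack(out)

torch.manual_seed(0)
w = (0.3 * torch.randn(20, 5)).requires_grad_()
x = (torch.rand(100, 4, 20) < 0.1).float()    # random input spike trains
target = torch.full((4, 5), 10.0)             # desired output spike counts

loss = ((lif_forward(x, w).sum(0) - target) ** 2).mean()
loss.backward()                  # gradients flow through the surrogate
with torch.no_grad():
    w -= 0.01 * w.grad           # one gradient-descent step
\end{verbatim}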
We further propose a biology-inspired mechanism for structural plasticity as well as an accompanying on-chip implementation on \bss2. The plasticity scheme enables the self-organized formation of receptive fields in a sparse network graph and allows learning with constrained resources.
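To sketch the flavor of such a scheme, the snippet below implements a generic prune-and-rewire loop over a fixed number of synapse slots per neuron. The fitness criterion, thresholds, and re-initialization are illustrative assumptions and differ in detail from the on-chip rule proposed in this work.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post, fan_in = 256, 32, 16   # each neuron sees 16 of 256 inputs

# Every post-synaptic neuron owns a fixed budget of synapse slots,
# each pointing at one pre-synaptic source with a plastic weight.
sources = rng.integers(0, n_pre, size=(n_post, fan_in))
weights = rng.uniform(0.0, 1.0, size=(n_post, fan_in))

def rewire(weights, sources, threshold=0.1):
    """Prune weak synapses and reassign their slots to random sources.

    Correlation-driven weight updates (not shown) strengthen useful
    inputs, so repeatedly replacing weak synapses lets receptive
    fields self-organize under a fixed resource budget.
    """
    weak = weights < threshold
    sources[weak] = rng.integers(0, n_pre, size=weak.sum())
    weights[weak] = 0.2           # re-initialize replaced synapses
    return weights, sources

weights, sources = rewire(weights, sources)
\end{verbatim}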
The present work thus combines the bottom-up approach of drawing inspiration from the nervous system's key architectural properties with the top-down strategy of adopting successful methods from machine learning. This combination allowed us to address some of the most challenging problems of analog brain-inspired computing.