Research Statement

Overview

My research has focused on machine learning in the context of brain-inspired or neuromorphic computing, where I have made hardware [1,3], software [4], and algorithmic [5], as well as conceptual contributions [5,7].

Event-based backpropagation

In collaboration with Timo Wunderlich, I derived EventProp, the analog of the backpropagation algorithm for continuous-time spiking neural networks [5]. It computes exact parameter gradients in networks of spiking neurons, with no restrictions on network topology or loss function. Previous work had either found solutions for particular cases or considered exact parameter gradients impossible to obtain due to the discontinuous nature of spike transitions. Notably, the algorithm can be implemented efficiently using event-based simulation and on digital neuromorphic hardware, requiring only temporally sparse communication during the backward phase. Furthermore, its memory requirements are proportional to the number of communicated spikes, an order-of-magnitude improvement over previous approaches. These properties also strongly suggest that an energy-efficient implementation in next-generation analog neuromorphic hardware is possible.
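The key observation – that spike times depend smoothly on the parameters even though spikes themselves are discrete events – can be illustrated on a toy neuron. The following is a didactic sketch, not the EventProp algorithm itself; the leak-free neuron model and all constants are hypothetical:

```python
import numpy as np

# Toy illustration: for a leak-free integrate-and-fire neuron driven by a
# constant input current w, the membrane potential is V(t) = w * t and the
# spike time t* solves V(t*) = theta.  The implicit function theorem gives
# an exact, well-defined gradient
#     dt*/dw = -(dV/dw) / (dV/dt) evaluated at t = t*,
# despite the spike itself being a discontinuous event.

theta = 1.0   # firing threshold (hypothetical units)
w = 0.5       # input weight / current

t_star = theta / w              # spike time from V(t*) = theta
grad_exact = -t_star / w        # -(dV/dw)/(dV/dt) = -t*/w = -theta/w**2

# Finite-difference check of the spike-time gradient.
eps = 1e-6
grad_fd = ((theta / (w + eps)) - (theta / (w - eps))) / (2 * eps)

print(t_star, grad_exact, grad_fd)
```

The same implicit-differentiation argument, applied to the full adjoint dynamics, is what makes exact event-based gradients possible in the general case.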

In ongoing work, we have demonstrated that the algorithm can estimate parameter gradients by observing only the spikes emitted by analog neuromorphic hardware emulating a known spiking neural network [8]. On the tasks we have evaluated so far, we match the performance of a previous approach [9] while not requiring densely sampled membrane-voltage traces. Such dense observation of the system state is far less information-efficient and would be prohibitive for large-scale systems.

While EventProp cannot be considered biologically plausible, I believe it to be one key enabling “technology” for NeuroAI [10] – analogous to how the backpropagation algorithm was foundational to the current rapid progress in machine learning – once it is combined with further insights from neuroscience, such as connectivity structure and neuron dynamics, as well as large-scale electrophysiological and connectivity data.

There also exists an analog of real-time recurrent learning (RTRL) [11] for spiking neural networks, along with corresponding approximate truncations, which I derived in my thesis [12] but have yet to evaluate experimentally.
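For readers unfamiliar with RTRL, the following NumPy sketch shows the classical non-spiking version for a vanilla tanh RNN: the full sensitivity tensor is propagated forward in time, so gradients are available online without storing the trajectory. The spiking analogue mentioned above replaces these dynamics with event-based neuron dynamics; dimensions and inputs here are arbitrary illustrations, checked against finite differences:

```python
import numpy as np

# Real-time recurrent learning (RTRL) for h_t = tanh(W h_{t-1} + x_t).
# RTRL maintains S_t[i, j, k] = d h_t[i] / d W[j, k] forward in time.

rng = np.random.default_rng(0)
n, T = 3, 5
W = 0.5 * rng.standard_normal((n, n))
xs = rng.standard_normal((T, n))

def rtrl_grad(W):
    h = np.zeros(n)
    S = np.zeros((n, n, n))            # S[i, j, k] = dh[i]/dW[j, k]
    for x in xs:
        h_new = np.tanh(W @ h + x)
        d = 1.0 - h_new ** 2           # tanh'(a) at the pre-activation
        # da[i]/dW[j,k] = delta_ij * h_{t-1}[k] + sum_m W[i,m] S[m,j,k]
        dS = np.einsum('im,mjk->ijk', W, S)
        for i in range(n):
            dS[i, i, :] += h
        S = d[:, None, None] * dS
        h = h_new
    # For the loss L = sum_i h_T[i], dL/dW[j,k] = sum_i S_T[i,j,k].
    return h, S.sum(axis=0)

h_T, grad = rtrl_grad(W)

# Finite-difference check of one weight entry.
eps = 1e-6
Wp, Wm = W.copy(), W.copy()
Wp[0, 1] += eps
Wm[0, 1] -= eps
fd = (rtrl_grad(Wp)[0].sum() - rtrl_grad(Wm)[0].sum()) / (2 * eps)
print(grad[0, 1], fd)
```

The O(n^3) sensitivity tensor is exactly what the approximate truncations mentioned above aim to compress.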

Correlations in Spiking Neural Networks

In collaboration with Christof Wetterich, I investigated how correlations – fundamental both to quantum mechanics and, potentially, to computation in biological systems – can be learned in networks of spiking neurons [7]. One main contribution of this work was a first demonstration that correlations in networks of spiking neurons can approximate specific low-dimensional quantum density matrices. Another was the demonstration that neural sampling can be implemented using an end-to-end learning approach, without any prior assumptions on the underlying equilibrium probability distribution. Further research in this direction could yield a practical way to approximate a restricted set of density matrices – for example, ground states of particular quantum spin systems – or to perform quantum state tomography. It could also inform theoretical tools for studying correlations in spiking neural networks.
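The basic object in this line of work can be sketched as follows: given binary states sampled from a stationary distribution, estimate the pairwise correlation map C[i, j] = ⟨s_i s_j⟩. In the cited work such correlations are trained end-to-end to approximate entries of a target density matrix; the hand-made two-unit sampler below is only a stand-in for the learned spiking network:

```python
import numpy as np

# Estimate pairwise correlations <s_i s_j> from samples s in {-1, +1}^n.
rng = np.random.default_rng(1)
n_samples = 50_000

# Toy equilibrium distribution: s_1 is a fair coin flip, s_2 copies s_1
# with probability p, so that <s_1 s_2> = p - (1 - p) = 2p - 1.
p = 0.9
s1 = rng.choice([-1, 1], size=n_samples)
copies = rng.random(n_samples) < p
s2 = np.where(copies, s1, -s1)
samples = np.stack([s1, s2], axis=1)

C = samples.T @ samples / n_samples     # empirical correlation map
print(C[0, 1])                          # approx 2*p - 1 = 0.8
```

In the actual learning setup, the network parameters – not the copy probability – are adjusted so that this empirical map matches the target.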

Hardware Design and Plasticity Experiments

I was part of the design team of an analog neuromorphic processor [1] – BrainScaleS-2. In particular, I was responsible for scaling up and verifying the “plasticity processing unit,” an embedded processor with a single-instruction, multiple-data (SIMD) unit and parallel access to analog system observables. The purpose of this processor is to enable “hybrid plasticity,” that is, the flexible implementation of bio-inspired learning rules that can directly interact with the analog emulation of neuron dynamics [13]. Since the analog components have time constants approximately 10^3 times faster than those of biological neuron voltage dynamics, the system enables the rapid evaluation of plasticity rules over long biological timescales, as well as the use of evolutionary algorithms. I evaluated and designed plasticity experiments and extended the digital implementation based on user requirements, interacting with both computational and experimental neuroscientists to determine which current computational approaches to bio-plausible learning and plasticity could be realized. My main contributions were: suggesting the implementation of meta-plasticity (realized by executing a small artificial neural network on the plasticity processing unit, whose weights were optimized using evolutionary strategies) and contributing to a demonstration of learning-to-learn on neuromorphic hardware [6]; implementing a memory interface, which enabled hardware-in-the-loop learning based on membrane-trace measurements [9]; and implementing spike routing and I/O for a prototype system, which enabled closed-loop experiments [2] and a first demonstration of R-STDP reinforcement learning [14].
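To give a flavor of the kind of learning rule such a plasticity processor executes, here is a sketch of a pair-based STDP update driven by exponentially filtered spike traces, applied in parallel across a weight array. The constants, the trace formulation, and the random spike inputs are illustrative, not the hardware's actual rule:

```python
import numpy as np

# Pair-based STDP from eligibility traces: potentiate pre-before-post
# pairings on postsynaptic spikes, depress post-before-pre pairings on
# presynaptic spikes.  All constants are hypothetical.
a_plus, a_minus, decay = 0.01, 0.012, 0.9

def stdp_step(w, pre_trace, post_trace, pre_spikes, post_spikes):
    # Decay the eligibility traces, then accumulate new spikes.
    pre_trace = decay * pre_trace + pre_spikes
    post_trace = decay * post_trace + post_spikes
    # Vectorized update over the full weight array.
    dw = (a_plus * np.outer(pre_trace, post_spikes)
          - a_minus * np.outer(pre_spikes, post_trace))
    return w + dw, pre_trace, post_trace

rng = np.random.default_rng(2)
w = np.zeros((3, 2))
pre_t, post_t = np.zeros(3), np.zeros(2)
for _ in range(100):                       # random Bernoulli spike input
    pre = (rng.random(3) < 0.2).astype(float)
    post = (rng.random(2) < 0.2).astype(float)
    w, pre_t, post_t = stdp_step(w, pre_t, post_t, pre, post)
print(w)
```

On the hardware, the spike traces are analog observables read out in parallel, and the update loop runs on the embedded SIMD unit at accelerated timescales.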

Convenient SNN training compatible with Deep Learning

I created and co-develop Norse [4], one of the first software libraries that allows non-experts to train and simulate spiking neural networks in a way that is readily interoperable with common concepts from deep learning. Prior work had either used programming abstractions taken from neuron simulators, such as populations and projections, or was tightly coupled to the implementation choices of a specific publication, with little chance of reuse. Norse has now been adopted by several groups and is in active use.
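The design idea can be sketched as follows: a spiking layer behaves like a recurrent deep-learning cell, consuming an input tensor and a state and returning a (spikes, new_state) pair that composes with ordinary layers and training loops. This is a NumPy mock-up of that interface style, not the actual Norse API:

```python
import numpy as np

class LIFCell:
    """Discrete-time leaky integrate-and-fire cell with a cell-like API."""

    def __init__(self, n_in, n_out, tau=0.9, threshold=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.standard_normal((n_in, n_out)) / np.sqrt(n_in)
        self.tau, self.threshold = tau, threshold

    def __call__(self, x, v=None):
        if v is None:                          # fresh state on first call
            v = np.zeros(self.w.shape[1])
        v = self.tau * v + x @ self.w          # leaky integration
        spikes = (v >= self.threshold).astype(float)
        v = np.where(spikes > 0, 0.0, v)       # reset membrane on spike
        return spikes, v

cell = LIFCell(n_in=4, n_out=2)
state = None
for t in range(10):                            # unroll over time steps
    spikes, state = cell(np.ones(4), state)
print(spikes, state)
```

Because the state is passed explicitly, the same unrolling, batching, and gradient machinery used for recurrent deep-learning cells applies directly.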


[1]
C. Pehle, S. Billaudelle, B. Cramer, J. Kaiser, K. Schreiber, Y. Stradmann, J. Weis, A. Leibfried, E. Müller, and J. Schemmel, The BrainScaleS-2 Accelerated Neuromorphic System with Hybrid Plasticity, Frontiers in Neuroscience 16, (2022).
[2]
K. Schreiber, T. C. Wunderlich, C. Pehle, M. A. Petrovici, J. Schemmel, and K. Meier, Closed-Loop Experiments on the BrainScaleS-2 Architecture, in Proceedings of the Neuro-Inspired Computational Elements Workshop (Association for Computing Machinery, Heidelberg, Germany, 2020).
[3]
S. A. Aamir, Y. Stradmann, P. Müller, C. Pehle, A. Hartel, A. Grübl, J. Schemmel, and K. Meier, An Accelerated LIF Neuronal Network Array for a Large-Scale Mixed-Signal Neuromorphic Architecture, IEEE Transactions on Circuits and Systems I: Regular Papers 65, 4299 (2018).
[4]
C. Pehle and J. E. Pedersen, Norse – A Deep Learning Library for Spiking Neural Networks (2021).
[5]
T. C. Wunderlich and C. Pehle, Event-Based Backpropagation Can Compute Exact Gradients for Spiking Neural Networks, Scientific Reports 11, 1 (2021).
[6]
T. Bohnstingl, F. Scherr, C. Pehle, K. Meier, and W. Maass, Neuromorphic Hardware Learns to Learn, Frontiers in Neuroscience 13, 483 (2019).
[7]
C. Pehle and C. Wetterich, Neuromorphic Quantum Computing, Phys. Rev. E 106, 045311 (2022).
[8]
C. Pehle, L. Blessing, E. Arnold, E. Müller, and J. Schemmel, Event-Based Backpropagation for Analog Neuromorphic Hardware, in preparation (2022).
[9]
B. Cramer, S. Billaudelle, S. Kanya, A. Leibfried, A. Grübl, V. Karasenko, C. Pehle, K. Schreiber, Y. Stradmann, J. Weis, and others, Surrogate Gradients for Analog Neuromorphic Computing, Proceedings of the National Academy of Sciences 119, e2109194119 (2022).
[10]
A. Zador, B. Richards, B. Ölveczky, S. Escola, Y. Bengio, K. Boahen, M. Botvinick, D. Chklovskii, A. Churchland, C. Clopath, and others, Toward Next-Generation Artificial Intelligence: Catalyzing the NeuroAI Revolution, arXiv preprint arXiv:2210.08340 (2022).
[11]
R. J. Williams and D. Zipser, Experimental Analysis of the Real-Time Recurrent Learning Algorithm, Connection Science 1, 87 (1989).
[12]
C. Pehle, Adjoint Equations of Spiking Neural Networks, PhD thesis, 2021.
[13]
S. Friedmann, J. Schemmel, A. Grübl, A. Hartel, M. Hock, and K. Meier, Demonstrating Hybrid Learning in a Flexible Neuromorphic Hardware System, IEEE Transactions on Biomedical Circuits and Systems 11, 128 (2016).
[14]
T. Wunderlich, A. F. Kungl, E. Müller, A. Hartel, Y. Stradmann, S. A. Aamir, A. Grübl, A. Heimbrecht, K. Schreiber, D. Stöckel, C. Pehle, and others, Demonstrating Advantages of Neuromorphic Computation: A Pilot Study, Frontiers in Neuroscience 13, 260 (2019).