Recent years have seen many exciting developments in training spiking neural networks to perform complex information processing. This online workshop brings together researchers in the field to present their work and discuss ways of translating these findings into a better understanding of neural circuits. Topics include artificial and biologically plausible learning algorithms, and the dissection of trained spiking circuits toward a better understanding of neural processing. We have kept the number of talks manageable to leave ample time for discussion.

The workshop is being organised by Dan Goodman and Friedemann Zenke.

Registration

Registration is now closed. A replay of the talks is available on Crowdcast, and the talks plus one of the recorded discussions are on YouTube. Note that Claudia Clopath’s talk and the day 2 discussion were not recorded.

Agenda

Talks will be 45 minutes long (30 minutes of presentation plus 15 minutes of questions and discussion). Hover over a talk title to see its abstract (where available).

August 31st

Time (CET) Session
14:00 Welcome by organizers
14:10 Sander Bohte (CWI)
Effective and Efficient Computation with Multiple-timescale Spiking Recurrent Neural Networks
14:55 Iulia M. Comsa (Google Research)
On temporal coding in spiking neural networks with alpha synaptic function
15:40 Break (30 mins)
16:10 Franz Scherr (TUG)
E-prop: A biologically inspired paradigm for learning in recurrent networks of spiking neurons
16:55 Emre Neftci (UC Irvine)
Synthesizing Machine Intelligence in Neuromorphic Computers with Differentiable Programming
17:40 Break (30 mins)
18:10 Discussion (can continue as long as needed)
Current technical constraints and bottlenecks.
We can train spiking neural networks. What now?

September 1st

Time (CET) Session
14:00 Welcome by organizers
14:10 Timothee Masquelier (CNRS Toulouse)
Back-propagation in spiking neural networks
14:55 Claudia Clopath (Imperial College)
Training spiking neurons with FORCE to uncover hippocampal function
15:40 Break (30 mins)
16:10 Richard Naud (U Ottawa)
Burst-dependent synaptic plasticity can coordinate learning in hierarchical circuits
16:55 Julian Goeltz (Uni Bern)
Fast and deep neuromorphic learning with time-to-first-spike coding
17:40 Break (30 mins)
18:10 Discussion (can continue as long as needed)
Why spiking?

Discussion topics

Monday, August 31

We will discuss two topics:

Current technical constraints and bottlenecks.

How large do SNNs need to be, and how do we train them, to serve our research goals? What are the next steps the field should take to move forward swiftly?

We can train spiking neural networks. What now?

How do we leverage this methodological advance to gain a better understanding of how the brain processes information? What constitutes a conceptual advance? And how do we compare trained spiking neural networks to biology?

Tuesday, September 1

Why spiking?

Neurons communicate via precisely timed, discrete pulses rather than by analogue signals. Why? Is there a computational advantage to this mode of communication, or is it just to save energy? With the recent advances in our ability to train spiking neural networks discussed in this workshop, can we throw new light on this age-old question and outline a programme to resolve it?