Learning delays through gradients and structure: emergence of spatiotemporal patterns in spiking neural networks

Front Comput Neurosci. 2024 Dec 20;18:1460309. doi: 10.3389/fncom.2024.1460309. eCollection 2024.

Abstract

We present a Spiking Neural Network (SNN) model that incorporates learnable synaptic delays through two approaches: per-synapse delay learning via Dilated Convolutions with Learnable Spacings (DCLS) and a dynamic pruning strategy that also serves as a form of delay learning. In the latter approach, the network dynamically selects and prunes connections, optimizing the delays in sparse connectivity settings. We evaluate both approaches on the Raw Heidelberg Digits keyword spotting benchmark using Backpropagation Through Time with surrogate gradients. Our analysis of the spatio-temporal structure of synaptic interactions reveals that, after training, excitation and inhibition group together in space and time. Notably, the dynamic pruning approach, which employs DEEP R for connection removal and RigL for reconnection, not only preserves these spatio-temporal patterns but also outperforms per-synapse delay learning in sparse networks. Our results demonstrate the potential of combining delay learning with dynamic pruning to develop efficient SNN models for temporal data processing. Moreover, the preservation of spatio-temporal dynamics throughout pruning and rewiring highlights the robustness of these features, providing a solid foundation for future neuromorphic computing applications.
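For illustration, the idea behind DCLS-style per-synapse delay learning can be sketched in PyTorch as below. This is a minimal sketch of the mechanism, not the DCLS library's API or the authors' implementation: the class name `DelayedDense`, the Gaussian width `sigma`, and the maximum delay are illustrative assumptions. Each synaptic delay is a real-valued parameter, realised as a Gaussian bump on the temporal axis of a causal convolution, so delays remain differentiable and can be trained with BPTT alongside the weights.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DelayedDense(nn.Module):
    """Fully connected layer with one learnable delay per synapse (sketch)."""

    def __init__(self, n_in, n_out, max_delay=25, sigma=2.0):
        super().__init__()
        self.weight = nn.Parameter(0.1 * torch.randn(n_out, n_in))
        # Real-valued delay per synapse, in time steps, learned by gradient descent.
        self.delay = nn.Parameter(max_delay * torch.rand(n_out, n_in))
        self.max_delay, self.sigma = max_delay, sigma

    def forward(self, spikes):  # spikes: (batch, n_in, T) binary spike trains
        T_k = self.max_delay + 1
        t = torch.arange(T_k, device=spikes.device)
        # Differentiable "one-hot at the delay": a Gaussian bump per synapse,
        # whose width sigma is typically annealed towards zero during training.
        d = self.delay.clamp(0, self.max_delay).unsqueeze(-1)
        bump = torch.exp(-0.5 * ((t - d) / self.sigma) ** 2)
        bump = bump / bump.sum(-1, keepdim=True)
        kernel = self.weight.unsqueeze(-1) * bump      # (n_out, n_in, T_k)
        # Causal temporal convolution: current(t) = sum_tau kernel[tau] * s(t - tau)
        x = F.pad(spikes, (T_k - 1, 0))
        return F.conv1d(x, kernel.flip(-1))            # (batch, n_out, T)
```

The returned currents would then drive spiking neurons trained with surrogate gradients; that part is omitted here.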
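Likewise, one rewiring step combining DEEP R-style removal with RigL-style regrowth might look as follows. The function `rewire` and its dense tensor layout are hypothetical, and the sign test simplifies DEEP R (which fixes a sign per synapse and drops a connection when its parameter crosses zero). In the setting described in the abstract, each candidate connection carries a fixed delay, so selecting which connections survive doubles as selecting delays.

```python
import torch

@torch.no_grad()
def rewire(weight, mask, grad):
    """One prune-and-regrow step (sketch). weight, mask, grad: (n_out, n_in)."""
    # DEEP R-style removal: drop active synapses whose (positive-sign)
    # weights have crossed zero.
    dropped = mask.bool() & (weight <= 0)
    mask[dropped] = 0.0

    # RigL-style regrowth: re-activate as many connections as were dropped,
    # choosing the inactive positions with the largest gradient magnitude.
    n_grow = int(dropped.sum())
    if n_grow > 0:
        score = grad.abs().masked_fill(mask.bool(), float("-inf"))
        idx = torch.topk(score.flatten(), n_grow).indices
        mask.view(-1)[idx] = 1.0
        weight.view(-1)[idx] = 0.0   # new synapses start at zero weight

    weight.mul_(mask)                # keep pruned weights exactly zero
    return weight, mask
```

Calling a step like this periodically during training keeps the number of active connections constant while letting the connectivity pattern, and hence the realised delays, adapt.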

Keywords: delay learning; dynamic pruning; receptive field; sparse connectivity; spiking neural network.

Grants and funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. This work was supported by a Leverhulme Trust Doctoral Scholarship and by EPSRC grants EP/S030964/1 and EP/V052241/1.