Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks

Malu Zhang, Jiadong Wang, Jibin Wu, Ammar Belatreche, Burin Amornpaisannon, Zhixuan Zhang, V. P. K. Miriyala, Hong Qu*, Yansong Chua, Trevor E. Carlson, Haizhou Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Spiking Neural Networks (SNNs) use spatiotemporal spike patterns to represent and transmit information, which is not only biologically realistic but also well suited to ultra-low-power, event-driven neuromorphic implementation. Like other deep learning techniques, Deep Spiking Neural Networks (DeepSNNs) benefit from a deep architecture. However, training DeepSNNs is not straightforward because the well-studied error back-propagation (BP) algorithm is not directly applicable. In this paper, we first establish an understanding of why error back-propagation does not work well in DeepSNNs.
We then propose a simple yet efficient Rectified Linear Postsynaptic Potential function (ReL-PSP) for spiking neurons and a Spike-Timing-Dependent Back-Propagation (STDBP) learning algorithm for DeepSNNs, where the timing of individual spikes is used to convey information (temporal coding) and learning (back-propagation) is performed on spike timings in an event-driven manner. We show that DeepSNNs trained with the proposed single-spike, time-based learning algorithm can achieve state-of-the-art classification accuracy. Furthermore, using the model parameters obtained from the proposed STDBP learning algorithm, we demonstrate ultra-low-power inference on a recently proposed neuromorphic inference accelerator. The experimental results show that the neuromorphic hardware consumes a total power of 0.751 mW and achieves a low latency of 47.71 ms to classify an image from the MNIST dataset. Overall, this work investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future DeepSNNs and neuromorphic hardware.
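Based on the abstract's description, a minimal sketch of a ReL-PSP-style neuron model: the postsynaptic potential kernel is zero before the presynaptic spike and grows linearly afterwards, so the membrane potential is a weighted sum of rectified-linear ramps. Function names and the example values are illustrative, not from the paper.

```python
import numpy as np

def rel_psp(t, t_spike):
    """Rectified Linear PSP kernel: 0 before the presynaptic spike
    at t_spike, then increases linearly with (t - t_spike)."""
    return np.maximum(0.0, t - t_spike)

def membrane_potential(t, spike_times, weights):
    """Membrane potential at time t: weighted sum of the ReL-PSP
    kernels of all presynaptic spikes. The neuron would fire its
    (single) output spike when this crosses a threshold."""
    return float(np.sum(weights * rel_psp(t, spike_times)))

# Illustrative example: two presynaptic spikes at t = 1.0 and t = 2.0.
spikes = np.array([1.0, 2.0])
w = np.array([0.5, -0.2])
v = membrane_potential(3.0, spikes, w)  # 0.5 * 2.0 + (-0.2) * 1.0 = 0.8
```

Because the kernel is piecewise linear, the output spike time is a piecewise-linear function of the input spike times, which is what makes exact gradients with respect to spike timings tractable for back-propagation.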
Original language: English
Pages (from-to): 1947-1958
Number of pages: 12
Journal: IEEE Transactions on Neural Networks and Learning Systems
Issue number: 5
Early online date: 17 Sept 2021
Publication status: Published - May 2022


