Spike Timing Dependent Gradient for Direct Training of Fast and Efficient Binarized Spiking Neural Networks
Abstract
Spiking neural networks (SNNs) are well-suited for neuromorphic hardware due to their biological plausibility and energy efficiency. These networks communicate via sparse, asynchronous spikes and can be binarized. However, training such networks presents several challenges due to their non-differentiable activation function and binarized inter-layer data movement. The well-established backpropagation through time (BPTT) algorithm used to train SNNs encounters notable difficulties because of its substantial memory consumption and extensive computational demands. These limitations restrict its practical utility in real-world scenarios. Therefore, effective techniques are required to train such networks efficiently while preserving accuracy. In this paper, we propose Binarized Spike Timing Dependent Gradient (BSTDG), a novel method that utilizes presynaptic and postsynaptic timings to bypass the non-differentiable gradient and the need for BPTT. Additionally, we employ binarized weights with a threshold training strategy to enhance energy savings and performance. Moreover, we exploit latency/temporal-based coding and the Integrate-and-Fire (IF) model to achieve significant computational advantages. We evaluate the proposed method on Caltech101 Face/Motorcycle, MNIST, Fashion-MNIST, and Spiking Heidelberg Digits. The results demonstrate that the accuracy attained surpasses that of existing BSNNs and single-spike networks under the same structure. Furthermore, the proposed model achieves up to 30× speedup in inference and effectively reduces the number of spikes emitted in the hidden layer by 50% compared to previous works.
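The abstract mentions latency/temporal coding with the Integrate-and-Fire (IF) model, where each neuron emits at most a single spike whose timing carries the information. A minimal illustrative sketch of this general idea (not the paper's actual implementation; the function name, binarized weight values, and parameters below are assumptions for demonstration):

```python
import numpy as np

def if_neuron_first_spike(weights, input_times, threshold=1.0, t_max=100):
    """Non-leaky Integrate-and-Fire neuron with time-to-first-spike coding.

    Each presynaptic neuron fires once at input_times[i]. At each step the
    membrane potential is the sum of weights whose input spikes have already
    arrived; the neuron emits a single spike at the first step the potential
    crosses threshold. This is a generic textbook sketch, not BSTDG itself.
    """
    for t in range(t_max):
        # accumulate contributions from inputs that have arrived by time t
        v = weights[input_times <= t].sum()
        if v >= threshold:
            return t  # single output spike; its latency encodes the result
    return None  # no spike within the simulation window

# binarized weights in {-1, +1}; earlier input spikes contribute sooner
w = np.array([1.0, 1.0, -1.0, 1.0])
times = np.array([0, 2, 5, 9])
print(if_neuron_first_spike(w, times, threshold=2.0))  # spikes at t=2
```

Because each neuron spikes at most once, inference can stop as soon as an output neuron fires, which is the kind of property that enables the sparsity and speedup figures reported above.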
Journal
IEEE JOURNAL ON EMERGING AND SELECTED TOPICS IN CIRCUITS AND SYSTEMS
Publication Name
N/A
Volume
13
ISBN/ISSN
2156-3365
Edition
N/A
Issue
4
Pages Count
11
Location
N/A
Publisher
Institute of Electrical and Electronics Engineers
Publisher Url
N/A
Publisher Location
N/A
Publish Date
N/A
Url
N/A
Date
N/A
EISSN
N/A
DOI
10.1109/JETCAS.2023.3328926