Navigating Local Minima in Quantized Spiking Neural Networks
Conference Publication

Abstract
Spiking and Quantized Neural Networks (NNs) are becoming exceedingly important for hyper-efficient implementations of Deep Learning (DL) algorithms. However, these networks face challenges when trained using error backpropagation, due to the absence of gradient signals when applying hard thresholds. The broadly accepted workaround is to use biased gradient estimators: surrogate gradients, which approximate thresholding in Spiking Neural Networks (SNNs), and Straight-Through Estimators (STEs), which completely bypass thresholding in Quantized Neural Networks (QNNs). While noisy gradient feedback has enabled reasonable performance on simple supervised learning tasks, such noise is thought to increase the difficulty of finding optima in loss landscapes, especially during the later stages of optimization. By periodically boosting the Learning Rate (LR) during training, we expect the network to navigate unexplored solution spaces that would otherwise be difficult to reach due to local minima, barriers, or flat surfaces. This paper presents a systematic evaluation of a cosine-annealed LR schedule coupled with weight-independent adaptive moment estimation as applied to Quantized SNNs (QSNNs). We provide a rigorous empirical evaluation of this technique on high-precision and 4-bit quantized SNNs across three datasets, demonstrating state-of-the-art performance on the more complex datasets. Our source code is available at: https://github.com/jeshraghian/QSNNs.
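The two gradient estimators named in the abstract, together with the periodic LR boosting, can be sketched in a few lines of PyTorch. The following is a minimal illustration, not the authors' implementation from the linked repository: a surrogate-gradient spike function, a 4-bit straight-through weight quantizer, and plain Adam (standing in for the paper's "weight-independent adaptive moment estimation") coupled with torch.optim.lr_scheduler.CosineAnnealingWarmRestarts. All layer sizes, restart periods, and hyperparameters are placeholders.

```python
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside threshold on the forward pass; a fast-sigmoid surrogate
    derivative on the backward pass (one common choice for SNNs)."""

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        # Spike if the membrane potential crosses the (zero-shifted) threshold.
        return (membrane_potential > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Fast-sigmoid surrogate: 1 / (1 + |U|)^2 stands in for the
        # undefined derivative of the hard threshold.
        surrogate = 1.0 / (1.0 + membrane_potential.abs()) ** 2
        return grad_output * surrogate


class STEQuantize(torch.autograd.Function):
    """Uniform weight quantization with a straight-through estimator:
    rounding on the forward pass, identity gradient on the backward pass."""

    @staticmethod
    def forward(ctx, weights, num_bits=4):
        q_levels = 2 ** (num_bits - 1) - 1
        scale = weights.abs().max() / q_levels
        return torch.round(weights / scale).clamp(-q_levels, q_levels) * scale

    @staticmethod
    def backward(ctx, grad_output):
        # Gradient passes straight through the non-differentiable rounding.
        return grad_output, None


# Cosine-annealed LR with warm restarts, coupled with Adam. Each restart
# boosts the LR back to its maximum, which is the mechanism the paper
# argues helps escape local minima and flat regions late in training.
model = nn.Linear(784, 10)  # placeholder network
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(
    optimizer, T_0=10, T_mult=1, eta_min=0.0  # restart every 10 epochs (illustrative)
)

for epoch in range(50):
    # ... forward pass, loss, and loss.backward() over the dataset go here,
    # using STEQuantize.apply(model.weight) for 4-bit weights and
    # SurrogateSpike.apply(membrane_potential) at each spiking layer ...
    optimizer.step()
    scheduler.step()  # anneals the LR; resets it to the maximum every T_0 epochs
```

Note that both estimators are biased by construction: the backward pass deliberately disagrees with the forward pass, which is the source of the gradient noise the LR-boosting schedule is meant to compensate for.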
Publication Name
Proceedings of the IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS 2022)
ISBN
9781665409964
Page Count
4
Location
Incheon, Republic of Korea
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Publisher Location
Piscataway, NJ, USA
DOI
10.1109/AICAS54282.2022.9869966