Full Integer Arithmetic Online Training for Spiking Neural Networks
Abstract
Spiking Neural Networks (SNNs) are promising for neuromorphic computing due to their biological plausibility and energy efficiency. However, training methods such as Backpropagation Through Time (BPTT) and Real-Time Recurrent Learning (RTRL) remain computationally intensive. This work introduces an integer-only online training algorithm that uses a mixed-precision approach to improve efficiency and reduce memory usage by over 60%. The method replaces floating-point operations with integer arithmetic, enabling hardware-friendly implementation. It generalizes to Convolutional and Recurrent SNNs (CSNNs, RSNNs), demonstrating versatility across architectures. Evaluations on MNIST and the Spiking Heidelberg Digits (SHD) dataset show that models trained with 16-bit shadow weights and 8- or 12-bit inference weights achieve accuracy comparable to, or better than, full-precision baselines. Despite some limitations at very low precision and in deeper models, performance remains robust. In conclusion, the proposed integer-only online learning algorithm offers an effective way to train SNNs efficiently, enabling deployment on resource-constrained neuromorphic hardware without sacrificing accuracy.
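To make the mixed-precision scheme mentioned above concrete, the following is a minimal sketch of how high-precision integer shadow weights could be updated online and then truncated to low-precision inference weights. It assumes a power-of-two learning-rate shift and simple truncation with saturation; the names (`update_shadow`, `SHADOW_BITS`, `INFER_BITS`, `lr_shift`) and the specific quantization rule are illustrative and not taken from the paper.

```python
import numpy as np

# Hypothetical bit widths, following the abstract's mixed-precision scheme.
SHADOW_BITS = 16      # high-precision "shadow" weights kept for accumulation
INFER_BITS = 8        # low-precision weights actually used at inference
SHIFT = SHADOW_BITS - INFER_BITS  # right-shift mapping shadow -> inference range

def clip_to_int(x, bits):
    """Saturate an integer array to the signed range of the given bit width."""
    lo, hi = -(1 << (bits - 1)), (1 << (bits - 1)) - 1
    return np.clip(x, lo, hi)

def update_shadow(shadow_w, int_grad, lr_shift=6):
    """Integer-only update: subtract a right-shifted integer gradient
    (the shift acts as a power-of-two learning rate)."""
    step = int_grad >> lr_shift
    return clip_to_int(shadow_w - step, SHADOW_BITS).astype(np.int16)

def to_inference_weights(shadow_w):
    """Derive low-precision inference weights by truncating the shadow weights."""
    return clip_to_int(shadow_w >> SHIFT, INFER_BITS).astype(np.int8)

# Toy usage: one online update step on a small weight matrix.
rng = np.random.default_rng(0)
shadow = rng.integers(-2000, 2000, size=(4, 4), dtype=np.int16)
grad = rng.integers(-512, 512, size=(4, 4), dtype=np.int32)  # integer surrogate gradient
shadow = update_shadow(shadow.astype(np.int32), grad)
w_infer = to_inference_weights(shadow.astype(np.int32))
print(shadow.dtype, w_infer.dtype)  # int16 shadow weights, int8 inference weights
```

Keeping the wider shadow weights lets small integer gradient steps accumulate without being lost to truncation, while only the 8- or 12-bit inference weights need to be stored and used on the target hardware.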