Paper Title
Efficient and Accurate Conversion of Spiking Neural Network with Burst Spikes
Paper Authors
Paper Abstract
The spiking neural network (SNN), a brain-inspired, energy-efficient neural network, has attracted the interest of researchers. Yet the training of spiking neural networks remains an open problem. One effective approach is to map the weights of a trained ANN to an SNN to achieve high inference ability. However, the converted spiking neural network often suffers from performance degradation and considerable time delay. To speed up the inference process and obtain higher accuracy, we theoretically analyze the errors in the conversion process from three perspectives: the differences between IF and ReLU, the time dimension, and the pooling operation. We propose a neuron model that releases burst spikes, a cheap yet highly efficient mechanism for handling residual information. In addition, Lateral Inhibition Pooling (LIPooling) is proposed to solve the inaccuracy caused by MaxPooling in the conversion process. Experimental results on CIFAR and ImageNet demonstrate that our algorithm is efficient and accurate. For example, our method achieves nearly lossless conversion of SNNs while using only about 1/10 of the simulation time (less than 100 time steps) and 0.693$\times$ the energy consumption of the typical method. Our code is available at https://github.com/Brain-Inspired-Cognitive-Engine/Conversion_Burst.
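To make the burst-spike idea in the abstract concrete, below is a minimal PyTorch-style sketch, not the authors' released implementation (see the repository linked above for that). The class name `BurstIFNeuron` and the parameters `threshold` and `max_burst` are hypothetical and chosen only for illustration: the neuron integrates its input and, when the membrane potential exceeds the firing threshold several times over, releases the surplus as multiple spikes within the same timestep (capped at `max_burst`) instead of leaving that residual information to later timesteps.

```python
# A minimal sketch of an integrate-and-fire neuron with burst spikes.
# Assumptions: class and parameter names are hypothetical; "reset by
# subtraction" is used so leftover membrane potential is preserved.
import torch


class BurstIFNeuron:
    """IF neuron that may emit several spikes in a single timestep."""

    def __init__(self, threshold: float = 1.0, max_burst: int = 5):
        self.threshold = threshold   # firing threshold (assumed value)
        self.max_burst = max_burst   # assumed cap on spikes per timestep
        self.membrane = None         # membrane potential, lazily initialised

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.membrane is None:
            self.membrane = torch.zeros_like(x)
        # Integrate the weighted input current.
        self.membrane = self.membrane + x
        # Spikes this timestep: how many full thresholds fit into the
        # accumulated potential, clipped to [0, max_burst].
        spikes = torch.clamp(torch.floor(self.membrane / self.threshold),
                             min=0, max=self.max_burst)
        # Reset by subtraction, keeping the residual potential.
        self.membrane = self.membrane - spikes * self.threshold
        return spikes


# Toy usage: a constant input of 2.3 with threshold 1.0 yields 2 spikes in
# the first timestep instead of spreading them over several timesteps.
if __name__ == "__main__":
    neuron = BurstIFNeuron(threshold=1.0, max_burst=5)
    print(neuron.forward(torch.full((1, 4), 2.3)))  # tensor([[2., 2., 2., 2.]])
```

In an ANN-to-SNN conversion pipeline, such a neuron would typically stand in for each ReLU of the source network; the actual conversion procedure, including LIPooling, is documented in the repository linked above.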