Paper Title

RMP-SNN: Residual Membrane Potential Neuron for Enabling Deeper High-Accuracy and Low-Latency Spiking Neural Network

Paper Authors

Bing Han, Gopalakrishnan Srinivasan, Kaushik Roy

Paper Abstract

Spiking Neural Networks (SNNs) have recently attracted significant research interest as the third generation of artificial neural networks that can enable low-power event-driven data analytics. The best performing SNNs for image recognition tasks are obtained by converting a trained Analog Neural Network (ANN), consisting of Rectified Linear Units (ReLU), to SNN composed of integrate-and-fire neurons with "proper" firing thresholds. The converted SNNs typically incur loss in accuracy compared to that provided by the original ANN and require sizable number of inference time-steps to achieve the best accuracy. We find that performance degradation in the converted SNN stems from using "hard reset" spiking neuron that is driven to fixed reset potential once its membrane potential exceeds the firing threshold, leading to information loss during SNN inference. We propose ANN-SNN conversion using "soft reset" spiking neuron model, referred to as Residual Membrane Potential (RMP) spiking neuron, which retains the "residual" membrane potential above threshold at the firing instants. We demonstrate near loss-less ANN-SNN conversion using RMP neurons for VGG-16, ResNet-20, and ResNet-34 SNNs on challenging datasets including CIFAR-10 (93.63% top-1), CIFAR-100 (70.93% top-1), and ImageNet (73.09% top-1 accuracy). Our results also show that RMP-SNN surpasses the best inference accuracy provided by the converted SNN with "hard reset" spiking neurons using 2-8 times fewer inference time-steps across network architectures and datasets.
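To make the reset mechanism concrete, below is a minimal NumPy sketch (not the authors' released code) contrasting the two reset schemes described in the abstract. The function name if_neuron_inference, the threshold value, and the constant-input example are illustrative assumptions; only the reset rules follow the abstract: a "hard reset" forces the membrane potential back to a fixed reset value (here 0) after a spike, whereas the RMP-style "soft reset" subtracts the threshold and keeps the residual charge.

import numpy as np

def if_neuron_inference(inputs, threshold, soft_reset=True):
    """Integrate-and-fire neuron simulated over discrete inference time-steps.

    inputs:     per-time-step weighted input current (1-D array)
    threshold:  firing threshold
    soft_reset: True  -> RMP-style "soft reset" (subtract the threshold,
                         retaining the residual membrane potential)
                False -> "hard reset" (membrane potential forced to 0,
                         discarding any charge above the threshold)
    Returns the binary output spike train.
    """
    v = 0.0                          # membrane potential
    spikes = np.zeros(len(inputs))
    for t, x in enumerate(inputs):
        v += x                       # integrate the input current
        if v >= threshold:
            spikes[t] = 1.0          # emit a spike
            if soft_reset:
                v -= threshold       # keep the residual potential above threshold
            else:
                v = 0.0              # hard reset loses the residual charge
    return spikes

# Example: constant input of 0.6 per step with threshold 1.0 over 10 steps.
x = np.full(10, 0.6)
print(if_neuron_inference(x, 1.0, soft_reset=True).sum())   # 6.0 spikes
print(if_neuron_inference(x, 1.0, soft_reset=False).sum())  # 5.0 spikes

In this toy run the soft-reset neuron fires 6 times in 10 steps, matching the ReLU-like target rate of input/threshold, while the hard-reset neuron fires only 5 times, illustrating the information loss during inference that the paper attributes to "hard reset" neurons.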
