Paper Title


Spike Calibration: Fast and Accurate Conversion of Spiking Neural Network for Object Detection and Segmentation

Authors

Yang Li, Xiang He, Yiting Dong, Qingqun Kong, Yi Zeng

Abstract


Spiking neural networks (SNNs) have attracted great attention due to their high biological plausibility and low energy consumption on neuromorphic hardware. As an efficient method to obtain deep SNNs, the conversion method has exhibited high performance on various large-scale datasets. However, it typically suffers from severe performance degradation and high time delays. In particular, most previous work focuses on simple classification tasks while ignoring the precise approximation of the ANN output. In this paper, we first theoretically analyze the conversion errors and derive the harmful effects of time-varying extremes on synaptic currents. We propose Spike Calibration (SpiCalib) to eliminate the damage of discrete spikes to the output distribution, and we modify LIPooling to allow lossless conversion of arbitrary MaxPooling layers. Moreover, Bayesian optimization of the normalization parameters is proposed to avoid empirical settings. Experimental results demonstrate state-of-the-art performance on classification, object detection, and segmentation tasks. To the best of our knowledge, this is the first time SNNs comparable to ANNs have been obtained on all of these tasks simultaneously. Moreover, we need only 1/50 of the inference time of previous work on the detection task, and we achieve the same performance at 0.492$\times$ the energy consumption of the ANN on the segmentation task.
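The abstract compresses several techniques into one paragraph; the core mechanism they all build on, rate-based ANN-to-SNN conversion with a data-driven normalization scale, can be illustrated in a few lines. The sketch below is not the paper's SpiCalib implementation: the soft-reset integrate-and-fire model, the `if_neuron_rate` helper, and the 99.9th-percentile scale are illustrative assumptions, and a simple percentile stands in for the Bayesian-optimized normalization parameter the authors propose.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def if_neuron_rate(x, threshold, T=256):
    """Simulate soft-reset IF neurons driven by a constant current x for T
    steps; return the spike rate rescaled back to activation units."""
    v = np.zeros_like(x)
    spikes = np.zeros_like(x)
    for _ in range(T):
        v = v + x                    # integrate the input current
        fired = (v >= threshold)     # emit a spike on threshold crossing
        spikes += fired
        v = v - fired * threshold    # soft reset: subtract the threshold
    # The firing rate approximates clip(x, 0, threshold), i.e., a bounded ReLU.
    return spikes / T * threshold

rng = np.random.default_rng(0)
acts = relu(rng.normal(size=10_000))   # stand-in for ANN activations
scale = np.percentile(acts, 99.9)      # data-driven normalization scale (assumed value)
approx = if_neuron_rate(acts, threshold=scale)
print("mean |ReLU - SNN rate|:", float(np.mean(np.abs(acts - approx))))
```

Soft reset (subtracting the threshold rather than zeroing the membrane potential) is the common choice in conversion pipelines because it preserves residual charge and reduces quantization error; raising `T` tightens the approximation at the cost of latency, which is the accuracy/delay trade-off the abstract targets.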
