Paper Title

Neural Architecture Search for Spiking Neural Networks

Authors

Youngeun Kim, Yuhang Li, Hyoungseob Park, Yeshwanth Venkatesha, Priyadarshini Panda

Abstract

Spiking Neural Networks (SNNs) have gained huge attention as a potential energy-efficient alternative to conventional Artificial Neural Networks (ANNs) due to their inherent high-sparsity activation. However, most prior SNN methods use ANN-like architectures (e.g., VGG-Net or ResNet), which could provide sub-optimal performance for temporal sequence processing of binary information in SNNs. To address this, in this paper, we introduce a novel Neural Architecture Search (NAS) approach for finding better SNN architectures. Inspired by recent NAS approaches that find the optimal architecture from activation patterns at initialization, we select the architecture that can represent diverse spike activation patterns across different data samples without training. Moreover, to further leverage the temporal information among the spikes, we search for feed-forward connections as well as backward connections (i.e., temporal feedback connections) between layers. Interestingly, SNASNet found by our search algorithm achieves higher performance with backward connections, demonstrating the importance of designing SNN architecture for suitably using temporal information. We conduct extensive experiments on three image recognition benchmarks where we show that SNASNet achieves state-of-the-art performance with significantly lower timesteps (5 timesteps). Code is available on GitHub.
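The training-free selection described above (scoring a candidate architecture by how diverse its spike activation patterns are across data samples at initialization) can be illustrated with a minimal sketch. This is an assumption-laden toy, not the authors' code: the function name `snn_score` is hypothetical, and it uses a NASWOT-style kernel log-determinant over binary spike patterns as the diversity measure.

```python
import numpy as np

def snn_score(spike_patterns):
    # spike_patterns: (N, D) binary matrix, one row of 0/1 spike
    # activations per input sample at initialization (hypothetical
    # stand-in for an untrained SNN's recorded spikes).
    n = spike_patterns.shape[0]
    # Kernel of pairwise pattern agreement: large off-diagonal
    # entries mean two samples elicit similar spike patterns,
    # i.e., the architecture distinguishes them poorly.
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = np.sum(spike_patterns[i] == spike_patterns[j])
    # The log-determinant rewards diverse (near-orthogonal) patterns;
    # it collapses toward -inf when patterns are nearly identical.
    sign, logdet = np.linalg.slogdet(K)
    return logdet

# Toy usage: random spike patterns score higher than identical ones.
rng = np.random.default_rng(0)
patterns = rng.integers(0, 2, size=(8, 64)).astype(float)
score = snn_score(patterns)
```

In a NAS loop, one would compute such a score for each randomly initialized candidate architecture and keep the highest-scoring one, avoiding any training during the search.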
