Paper Title

PC-SNN: Supervised Learning with Local Hebbian Synaptic Plasticity based on Predictive Coding in Spiking Neural Networks

Authors

Mengting Lan, Xiaogang Xiong, Zixuan Jiang, Yunjiang Lou

Abstract

Deemed the third generation of neural networks, event-driven Spiking Neural Networks (SNNs) combined with bio-plausible local learning rules are promising for building low-power neuromorphic hardware. However, because of the non-linearity and discrete nature of spiking neural networks, training SNNs remains difficult and is still under discussion. Originating from gradient descent, backprop has achieved stunning success in multi-layer SNNs. Nevertheless, it is assumed to lack biological plausibility while consuming relatively high computational resources. In this paper, we propose a novel learning algorithm inspired by predictive coding theory and show that it can perform supervised learning fully autonomously and as successfully as backprop, utilizing only local Hebbian plasticity. Moreover, this method achieves favorable performance compared to state-of-the-art multi-layer SNNs: test accuracy of 99.25% on the Caltech Face/Motorbike dataset, 84.25% on the ETH-80 dataset, 98.1% on the MNIST dataset, and 98.5% on the neuromorphic N-MNIST dataset. Furthermore, our work provides a new perspective on how supervised learning algorithms can be directly implemented in spiking neural circuitry, which may offer new insights into neuromorphic computation in neuroscience.
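
To make the core idea above concrete, the sketch below shows supervised predictive coding in which every weight update is local and Hebbian: the product of a presynaptic activity and a postsynaptic prediction error, with no backpropagated global gradient. This is a minimal rate-based stand-in, not the paper's spiking implementation; the layer sizes, nonlinearity, and constants (`sizes`, `f`, `T`, `dt`, `lr`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.tanh                              # rate nonlinearity (stand-in for spiking dynamics)
df = lambda x: 1.0 - np.tanh(x) ** 2     # its derivative

sizes = [784, 100, 10]                   # input, hidden, output (assumed sizes)
W = [rng.normal(0.0, 0.1, (sizes[l + 1], sizes[l])) for l in range(2)]

def train_step(x_in, y_target, T=20, dt=0.1, lr=0.01):
    """One predictive-coding step: clamp input and label, relax the hidden
    activity to reduce prediction errors, then apply purely local Hebbian
    updates (postsynaptic error times presynaptic rate)."""
    x = [x_in, f(W[0] @ x_in), y_target]      # feedforward init, output clamped
    for _ in range(T):                        # inference (relaxation) phase
        eps = [x[l + 1] - W[l] @ f(x[l]) for l in range(2)]
        # only the hidden layer moves, driven by locally available errors
        x[1] = x[1] + dt * (-eps[0] + df(x[1]) * (W[1].T @ eps[1]))
    eps = [x[l + 1] - W[l] @ f(x[l]) for l in range(2)]
    for l in range(2):                        # Hebbian weight update
        W[l] += lr * np.outer(eps[l], f(x[l]))

# usage: one update on a random input paired with a one-hot label
train_step(rng.random(784), np.eye(10)[3])
```

During relaxation, each layer uses only signals available at its own synapses (its local error and the error fed back through its outgoing weights), which is what allows the final weight update to stay purely Hebbian.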
