Paper Title
Controlling Information Capacity of Binary Neural Network
Paper Authors
Paper Abstract
Despite the growing popularity of deep learning technologies, high memory requirements and power consumption substantially limit their application in mobile and IoT areas. While binary convolutional networks can alleviate these problems, the limited bitwidth of weights often leads to significant degradation of prediction accuracy. In this paper, we present a method for training binary networks that maintains a stable predefined level of their information capacity throughout the training process by applying a Shannon-entropy-based penalty to convolutional filters. The results of experiments conducted on SVHN, CIFAR and ImageNet datasets demonstrate that the proposed approach yields statistically significant accuracy improvements for binary networks.
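To make the core idea concrete, the following is a minimal sketch of a Shannon-entropy-based capacity penalty for binary filters. The abstract does not specify the exact formulation, so the details here are assumptions for illustration: entropy is computed from the fraction of +1 weights in each binarized filter, and the penalty is a quadratic deviation from a predefined target entropy. The function names (`filter_entropy`, `capacity_penalty`) and the `target_entropy`/`strength` parameters are hypothetical, not taken from the paper.

```python
import numpy as np

def filter_entropy(weights):
    # Shannon entropy (in bits) of the sign distribution of a binarized filter.
    # p is the fraction of +1 weights; a balanced filter has entropy 1.0.
    p = float(np.mean(np.sign(weights) > 0))
    if p == 0.0 or p == 1.0:
        return 0.0  # degenerate filter: all weights share one sign
    return float(-p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p))

def capacity_penalty(filters, target_entropy=1.0, strength=0.1):
    # Hypothetical penalty keeping each filter's entropy near a predefined
    # level, as the abstract describes; added to the training loss.
    return strength * sum(
        (filter_entropy(w) - target_entropy) ** 2 for w in filters
    )

# Example: a balanced filter incurs no penalty at target entropy 1.0.
balanced = np.array([1.0, -1.0, 1.0, -1.0])
skewed = np.array([1.0, 1.0, 1.0, -1.0])
print(capacity_penalty([balanced]))          # zero for a balanced filter
print(capacity_penalty([skewed]) > 0.0)      # positive for a skewed filter
```

In an actual training loop this penalty would be differentiated through a relaxation of the sign function (e.g. a straight-through estimator), since `np.sign` itself has zero gradient almost everywhere.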