Paper Title

E2FIF: Push the limit of Binarized Deep Imagery Super-resolution using End-to-end Full-precision Information Flow

Paper Authors

Zhiqiang Lang, Chongxing Song, Lei Zhang, Wei Wei

Paper Abstract

Binary neural network (BNN) provides a promising solution to deploy parameter-intensive deep single image super-resolution (SISR) models onto real devices with limited storage and computational resources. To achieve performance comparable to the full-precision counterpart, most existing BNNs for SISR mainly focus on compensating for the information loss incurred by binarizing weights and activations in the network through better approximations to the binarized convolution. In this study, we revisit the difference between BNNs and their full-precision counterparts and argue that the key to good generalization performance of BNNs lies in preserving a complete full-precision information flow as well as an accurate gradient flow passing through each binarized convolution layer. Inspired by this, we propose to introduce a full-precision skip connection, or a variant of it, over each binarized convolution layer across the entire network, which increases the forward expressive capability and the accuracy of the back-propagated gradient, thus enhancing generalization performance. More importantly, such a scheme is applicable to any existing BNN backbone for SISR without introducing any additional computation cost. To verify its efficacy, we evaluate it with four different SISR backbones on four benchmark datasets and report clearly superior performance over existing BNNs and even some 4-bit competitors.
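
The core idea described in the abstract, carrying a full-precision skip connection over each binarized convolution layer, can be illustrated with a short PyTorch sketch. Everything below (the `BinarizeSTE` straight-through estimator, `BinaryConv2d`, and the `E2FIFBlock` name and layer layout) is an illustrative assumption based only on the abstract, not the authors' released implementation.

```python
# Minimal sketch: a binarized convolution whose full-precision input bypasses it
# via an identity shortcut, so a full-precision information flow (and an accurate
# gradient path) is preserved across the binarized layer. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through estimator for the gradient."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Clipped STE: pass the gradient through only where |x| <= 1.
        return grad_output * (x.abs() <= 1).to(grad_output.dtype)


class BinaryConv2d(nn.Conv2d):
    """Conv2d whose weights and input activations are binarized to {-1, +1}."""

    def forward(self, x):
        w_bin = BinarizeSTE.apply(self.weight)
        x_bin = BinarizeSTE.apply(x)
        return F.conv2d(x_bin, w_bin, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)


class E2FIFBlock(nn.Module):
    """Binarized conv layer wrapped by a full-precision identity skip connection."""

    def __init__(self, channels):
        super().__init__()
        self.conv = BinaryConv2d(channels, channels, kernel_size=3,
                                 padding=1, bias=False)
        self.act = nn.PReLU(channels)

    def forward(self, x):
        # The full-precision input is added back after the binarized convolution.
        return self.act(self.conv(x)) + x


if __name__ == "__main__":
    block = E2FIFBlock(channels=64)
    feat = torch.randn(1, 64, 48, 48)
    print(block(feat).shape)  # torch.Size([1, 64, 48, 48])
```

Stacking such blocks keeps an unbinarized path from the network input to the output, which is the "end-to-end full-precision information flow" the title refers to, while each binarized convolution adds only binary compute on top of the shortcut.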
