Paper Title
Batch Normalization with Enhanced Linear Transformation
Paper Authors
Paper Abstract
Batch normalization (BN) is a fundamental unit in modern deep networks, in which a linear transformation module is designed to improve BN's flexibility in fitting complex data distributions. In this paper, we demonstrate that properly enhancing this linear transformation module can effectively improve the ability of BN. Specifically, rather than using a single neuron, we propose to additionally consider each neuron's neighborhood when calculating the outputs of the linear transformation. Our method, named BNET, can be implemented with 2-3 lines of code in most deep learning libraries. Despite its simplicity, BNET brings consistent performance gains over a wide range of backbones and visual benchmarks. Moreover, we verify that BNET accelerates the convergence of network training and enhances spatial information by assigning larger weights to the more important neurons. The code is available at https://github.com/yuhuixu1993/BNET.
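The neighborhood-based linear transformation described in the abstract can be sketched as follows. This is a minimal NumPy illustration under my own reading of the abstract, not the authors' implementation (see the linked repository for that): after standard per-channel normalization, the scalar scale of BN is replaced by a per-channel k×k spatial aggregation (a depthwise convolution), with k = 1 recovering ordinary BN. The function name `bnet2d` and its signature are hypothetical.

```python
import numpy as np

def bnet2d(x, weight, eps=1e-5):
    """Sketch of a BNET-style layer (assumption: reconstructed from the abstract).

    x:      input of shape (N, C, H, W)
    weight: per-channel k x k kernels of shape (C, k, k); the enhanced linear
            transformation. Standard BN's scale is the special case k = 1.
    """
    # 1) Usual BN statistics: per channel, over batch and spatial dimensions.
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)

    # 2) Enhanced linear transform: instead of one scalar gamma per channel,
    #    aggregate each neuron's k x k spatial neighborhood (a depthwise
    #    convolution with zero "same" padding).
    C, k, _ = weight.shape
    p = k // 2
    xp = np.pad(x_hat, ((0, 0), (0, 0), (p, p), (p, p)))
    out = np.zeros_like(x_hat)
    for i in range(k):          # explicit loops keep the sketch dependency-free
        for j in range(k):
            out += weight[:, i, j][None, :, None, None] * \
                   xp[:, :, i:i + x.shape[2], j:j + x.shape[3]]
    return out
```

In a framework such as PyTorch, the same idea amounts to swapping BN's per-channel affine scale for a depthwise `Conv2d` applied to the normalized activations, which is why the abstract's claim of a 2-3 line change is plausible.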