Paper Title
Parameter-Free Average Attention Improves Convolutional Neural Network Performance (Almost) Free of Charge
Paper Authors
Paper Abstract
Visual perception is driven by a focus on the relevant aspects of the surrounding world. To transfer this observation to the digital information processing of computers, attention mechanisms have been introduced to highlight salient image regions. Here, we introduce a parameter-free attention mechanism called PfAAM, a simple yet effective module. It can be plugged into various convolutional neural network architectures with little computational overhead and without affecting model size. PfAAM was tested on multiple architectures for classification and semantic segmentation, improving model performance in all tested cases. This demonstrates its wide applicability as a general, easy-to-use module for computer vision tasks. The implementation of PfAAM can be found at https://github.com/nkoerb/pfaam.
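To illustrate the general shape of such a parameter-free attention module, the following is a minimal NumPy sketch. The abstract does not specify PfAAM's exact formulation, so this is only a plausible illustration assuming attention maps are formed by averaging over the channel and spatial dimensions and squashing with a sigmoid; the function name `pfaam` and the averaging scheme are assumptions, not the authors' definitive method (see the linked repository for the real implementation).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def pfaam(x):
    """Hypothetical parameter-free average attention over a feature map.

    x: feature map of shape (C, H, W).
    No learnable parameters are involved, so model size is unchanged.
    """
    # Spatial attention: average across channels -> shape (1, H, W)
    spatial = sigmoid(x.mean(axis=0, keepdims=True))
    # Channel attention: average across spatial dims -> shape (C, 1, 1)
    channel = sigmoid(x.mean(axis=(1, 2), keepdims=True))
    # Rescale the input element-wise via broadcasting
    return x * spatial * channel

# Usage example on a random feature map
x = np.random.randn(8, 4, 4)
y = pfaam(x)
print(y.shape)  # (8, 4, 4)
```

Because both attention maps lie in (0, 1), the module only rescales activations; it can be dropped between existing convolutional blocks without changing tensor shapes.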