Paper Title

Mix-Pooling Strategy for Attention Mechanism

Paper Authors

Shanshan Zhong, Wushao Wen, Jinghui Qin

Paper Abstract

Recently, many effective attention modules have been proposed to boost model performance by exploiting the internal information of convolutional neural networks in computer vision. In general, previous works overlook the design of the pooling strategy in the attention mechanism, taking global average pooling for granted, which hinders further improvement of the attention mechanism's performance. However, we empirically find and verify that a simple linear combination of global max-pooling and global min-pooling can produce pooling strategies that match or exceed the performance of global average pooling. Based on this empirical observation, we propose SPEM, a simple yet effective attention module that adopts a self-adaptive pooling strategy based on global max-pooling and global min-pooling, together with a lightweight module for producing the attention map. The effectiveness of SPEM is demonstrated by extensive experiments on widely used benchmark datasets and popular attention networks.
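The abstract describes an attention module built on a self-adaptive linear combination of global max-pooling and global min-pooling, followed by a lightweight module that produces the attention map. Below is a minimal PyTorch sketch of that idea; it is not the authors' released SPEM implementation, and the class name MixPoolAttention, the learnable mixing weight alpha, and the reduction parameter are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MixPoolAttention(nn.Module):
    """Sketch of a channel-attention block using a learnable mix of
    global max-pooling and global min-pooling (names are hypothetical)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Learnable mixing weight between max- and min-pooled descriptors.
        self.alpha = nn.Parameter(torch.tensor(0.5))
        # Lightweight excitation module producing the channel attention map.
        self.excite = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Global max- and min-pooling over the spatial dimensions.
        max_pool = x.amax(dim=(2, 3))  # (B, C)
        min_pool = x.amin(dim=(2, 3))  # (B, C)
        # Self-adaptive linear combination of the two pooled descriptors.
        pooled = self.alpha * max_pool + (1.0 - self.alpha) * min_pool
        attn = self.excite(pooled).view(b, c, 1, 1)
        return x * attn  # reweight the feature map channel-wise

# Usage example: apply to a feature map from a CNN block.
# attn = MixPoolAttention(channels=64)
# y = attn(torch.randn(8, 64, 32, 32))
```

Such a block can be dropped into existing attention networks wherever a squeeze-and-excitation-style module would sit; the only change relative to an average-pooling design is how the channel descriptor is pooled.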
