Paper Title
SATr: Slice Attention with Transformer for Universal Lesion Detection
Paper Authors
Abstract
Universal Lesion Detection (ULD) in computed tomography plays an essential role in computer-aided diagnosis. Promising ULD results have been reported by multi-slice-input detection approaches which model 3D context from multiple adjacent CT slices, but such methods still experience difficulty in obtaining a global representation among different slices and within each individual slice since they only use convolution-based fusion operations. In this paper, we propose a novel Slice Attention Transformer (SATr) block which can be easily plugged into convolution-based ULD backbones to form hybrid network structures. Such newly formed hybrid backbones can better model long-distance feature dependency via the cascaded self-attention modules in the Transformer block while still holding a strong power of modeling local features with the convolutional operations in the original backbone. Experiments with five state-of-the-art methods show that the proposed SATr block can provide an almost free boost to lesion detection accuracy without extra hyperparameters or special network designs.
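The core idea of the abstract, attention that mixes features across adjacent CT slices to capture long-range dependencies that convolutional fusion misses, can be illustrated with a minimal sketch. This is not the paper's SATr implementation: it uses plain scaled dot-product self-attention with Q = K = V (no learned projections, no multi-head structure), purely to show how each slice's fused feature becomes a weighted combination of all slices' features.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def slice_attention(slice_feats):
    """Toy scaled dot-product self-attention across CT slice features.

    slice_feats: list of N feature vectors (one per adjacent slice),
    each a list of d floats. Queries, keys, and values are the raw
    features themselves (an illustrative simplification), so the output
    for each slice is a convex combination of all slices' features --
    i.e., every slice can directly attend to every other slice.
    """
    n = len(slice_feats)
    d = len(slice_feats[0])
    scale = math.sqrt(d)
    out = []
    for i in range(n):
        # similarity of slice i's query to every slice's key
        scores = [sum(a * b for a, b in zip(slice_feats[i], slice_feats[j])) / scale
                  for j in range(n)]
        w = softmax(scores)  # attention weights over the N slices
        # weighted sum of value vectors -> fused feature for slice i
        out.append([sum(w[j] * slice_feats[j][k] for j in range(n))
                    for k in range(d)])
    return out
```

In the actual hybrid backbone described above, a block like this would sit alongside the convolutional layers, so local features still come from convolutions while cross-slice context comes from attention.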