Paper Title

SATS: Self-Attention Transfer for Continual Semantic Segmentation

Paper Authors

Yiqiao Qiu, Yixing Shen, Zhuohao Sun, Yanchong Zheng, Xiaobin Chang, Weishi Zheng, Ruixuan Wang

Paper Abstract

Continually learning to segment more and more types of image regions is a desired capability for many intelligent systems. However, such continual semantic segmentation suffers from the same catastrophic forgetting issue as in continual classification learning. While multiple knowledge distillation strategies originally designed for continual classification have been well adapted to continual semantic segmentation, they only consider transferring old knowledge based on the outputs from one or more layers of deep fully convolutional networks. Different from existing solutions, this study proposes to transfer a new type of information relevant to knowledge, i.e., the relationships between elements (e.g., pixels or small local regions) within each image, which can capture both within-class and between-class knowledge. The relationship information can be effectively obtained from the self-attention maps in a Transformer-style segmentation model. Considering that pixels belonging to the same class in each image often share similar visual properties, a class-specific region pooling is applied to provide more efficient relationship information for knowledge transfer. Extensive evaluations on multiple public benchmarks support that the proposed self-attention transfer method can further effectively alleviate the catastrophic forgetting issue, and its flexible combination with one or more widely adopted strategies significantly outperforms state-of-the-art solutions.
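The abstract only sketches the mechanism, so below is a minimal PyTorch-style sketch of how class-specific region pooling and pixel-to-region attention matching could be combined into a distillation loss. The function names (`class_region_pool`, `relation_distill_loss`), the exact loss form (MSE over softmax-normalized pixel-to-region affinities), and the assumption that labels are valid class indices at feature resolution (no ignore index) are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F


def class_region_pool(feats, labels, num_classes):
    """Average-pool features over each class region in the label map.

    feats:  (B, C, H, W) backbone features
    labels: (B, H, W) int64 class indices at feature resolution
    Returns (B, num_classes, C) per-class region descriptors
    (zero vectors for classes absent from an image).
    """
    B, C, H, W = feats.shape
    flat_feats = feats.flatten(2)                                 # (B, C, H*W)
    onehot = F.one_hot(labels, num_classes).float().flatten(1, 2)  # (B, H*W, K)
    counts = onehot.sum(dim=1).clamp(min=1.0)                     # (B, K)
    pooled = torch.einsum('bcn,bnk->bkc', flat_feats, onehot)     # (B, K, C)
    return pooled / counts.unsqueeze(-1)


def relation_distill_loss(feats_new, feats_old, labels, num_classes):
    """Match pixel-to-region attention maps of the new and old models.

    feats_old should come from the frozen old model (e.g. computed
    under torch.no_grad()) so gradients only flow through feats_new.
    """
    regions_new = class_region_pool(feats_new, labels, num_classes)  # (B, K, C)
    regions_old = class_region_pool(feats_old, labels, num_classes)
    q_new = feats_new.flatten(2).transpose(1, 2)                     # (B, H*W, C)
    q_old = feats_old.flatten(2).transpose(1, 2)
    scale = feats_new.shape[1] ** 0.5
    attn_new = torch.softmax(q_new @ regions_new.transpose(1, 2) / scale, dim=-1)
    attn_old = torch.softmax(q_old @ regions_old.transpose(1, 2) / scale, dim=-1)
    return F.mse_loss(attn_new, attn_old)
```

In practice this term would be added, with a weighting factor, to the segmentation loss and any other distillation terms used during the new learning step.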
