Paper Title

Intermediate Prototype Mining Transformer for Few-Shot Semantic Segmentation

Paper Authors

Yuanwei Liu, Nian Liu, Xiwen Yao, Junwei Han

Abstract

Few-shot semantic segmentation aims to segment the target objects in a query image under the condition of a few annotated support images. Most previous works strive to mine more effective category information from the support to match with the corresponding objects in the query. However, they all ignore the category information gap between query and support images. If the objects in them show large intra-class diversity, forcibly migrating the category information from the support to the query is ineffective. To solve this problem, we are the first to introduce an intermediate prototype for mining both deterministic category information from the support and adaptive category knowledge from the query. Specifically, we design an Intermediate Prototype Mining Transformer (IPMT) to learn the prototype in an iterative way. In each IPMT layer, we propagate the object information in both support and query features to the prototype and then use it to activate the query feature map. By conducting this process iteratively, both the intermediate prototype and the query feature can be progressively improved. Finally, the refined query feature is used to yield a precise segmentation prediction. Extensive experiments on both the PASCAL-5i and COCO-20i datasets clearly verify the effectiveness of our IPMT and show that it outperforms previous state-of-the-art methods by a large margin. Code is available at https://github.com/LIUYUANWEI98/IPMT
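
The abstract describes an iterative loop: a prototype absorbs object cues from both the masked support features and the currently estimated query foreground, and is then used to activate the query feature map. The PyTorch sketch below only illustrates that loop conceptually; the module name IntermediatePrototypeLayer, the single-head attention, the log-gated foreground weighting, and all tensor shapes are assumptions for illustration, not the authors' implementation (which is available at the repository linked above).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class IntermediatePrototypeLayer(nn.Module):
    """One iteration: the prototype attends to masked support tokens and the
    currently estimated query foreground, then re-activates the query feature map.
    This is a conceptual sketch, not the official IPMT layer."""

    def __init__(self, dim):
        super().__init__()
        self.to_q = nn.Linear(dim, dim)  # the prototype acts as the attention query
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.ffn = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, proto, supp_feat, supp_mask, qry_feat, qry_prob):
        # proto: (B, C); supp_feat, qry_feat: (B, C, H, W)
        # supp_mask: ground-truth support mask, qry_prob: current query foreground
        # estimate, both (B, 1, H, W); they gate which tokens the prototype reads.
        _, C, _, _ = supp_feat.shape
        tokens = torch.cat([supp_feat.flatten(2), qry_feat.flatten(2)], dim=2)  # (B, C, 2HW)
        gate = torch.cat([supp_mask.flatten(2), qry_prob.flatten(2)], dim=2)    # (B, 1, 2HW)

        q = self.to_q(proto).unsqueeze(1)              # (B, 1, C)
        k = self.to_k(tokens.transpose(1, 2))          # (B, 2HW, C)
        v = self.to_v(tokens.transpose(1, 2))          # (B, 2HW, C)
        attn = (q @ k.transpose(1, 2)) / C ** 0.5      # (B, 1, 2HW)
        attn = (attn + gate.clamp_min(1e-6).log()).softmax(dim=-1)  # soft foreground gating
        proto = proto + (attn @ v).squeeze(1)          # absorb object cues into the prototype
        proto = proto + self.ffn(proto)

        # Activate the query feature map with the updated prototype and refresh the
        # query foreground estimate for the next iteration.
        sim = F.cosine_similarity(qry_feat, proto[:, :, None, None], dim=1, eps=1e-6)
        qry_prob = ((sim + 1) / 2).unsqueeze(1)        # map similarity into [0, 1]
        qry_feat = qry_feat * (1 + qry_prob)           # re-weight the query feature map
        return proto, qry_feat, qry_prob


# Stacking a few layers progressively refines both the prototype and the query feature.
layers = nn.ModuleList([IntermediatePrototypeLayer(256) for _ in range(3)])
proto = torch.zeros(2, 256)                              # initial intermediate prototype
supp_feat = torch.randn(2, 256, 32, 32)
qry_feat = torch.randn(2, 256, 32, 32)
supp_mask = torch.randint(0, 2, (2, 1, 32, 32)).float()  # binary support mask
qry_prob = torch.ones(2, 1, 32, 32)                      # no prior on the query at the start
for layer in layers:
    proto, qry_feat, qry_prob = layer(proto, supp_feat, supp_mask, qry_feat, qry_prob)
```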
