Paper Title

Adaptive Fine-Grained Sketch-Based Image Retrieval

Paper Authors

Ayan Kumar Bhunia, Aneeshan Sain, Parth Shah, Animesh Gupta, Pinaki Nath Chowdhury, Tao Xiang, Yi-Zhe Song

Paper Abstract

The recent focus on Fine-Grained Sketch-Based Image Retrieval (FG-SBIR) has shifted towards generalising a model to new categories without any training data from them. In real-world applications, however, a trained FG-SBIR model is often applied to both new categories and different human sketchers, i.e., different drawing styles. Although this complicates the generalisation problem, fortunately, a handful of examples are typically available, enabling the model to adapt to the new category/style. In this paper, we offer a novel perspective -- instead of asking for a model that generalises, we advocate for one that quickly adapts, with just very few samples during testing (in a few-shot manner). To solve this new problem, we introduce a novel model-agnostic meta-learning (MAML) based framework with several key modifications: (1) As a retrieval task with a margin-based contrastive loss, we simplify the MAML training in the inner loop to make it more stable and tractable. (2) The margin in our contrastive loss is also meta-learned with the rest of the model. (3) Three additional regularisation losses are introduced in the outer loop, to make the meta-learned FG-SBIR model more effective for category/style adaptation. Extensive experiments on public datasets suggest a large gain over generalisation and zero-shot based approaches, and a few strong few-shot baselines.
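
The abstract describes test-time few-shot adaptation via a MAML-style inner loop over a margin-based contrastive loss, with the margin itself meta-learned alongside the model. The paper provides no code here, so the following is a minimal PyTorch sketch of that adaptation step under stated assumptions: `SketchEmbedder`, the triplet formulation, and the `inner_lr`/`steps` settings are illustrative choices, and the outer meta-training loop with the three regularisation losses is omitted.

```python
# Minimal sketch of few-shot test-time adaptation in the spirit of the paper.
# All names and hyperparameters here are illustrative assumptions, not the
# authors' released implementation.
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


class SketchEmbedder(nn.Module):
    """Toy stand-in for the FG-SBIR embedding backbone."""

    def __init__(self, in_dim=512, emb_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, emb_dim)
        )
        # The contrastive-loss margin is a learnable parameter, so it is
        # meta-learned together with the rest of the model (point 2 above).
        self.margin = nn.Parameter(torch.tensor(0.2))

    def forward(self, x):
        # L2-normalised embeddings keep the squared distances bounded.
        return F.normalize(self.net(x), dim=-1)


def margin_loss(model, sketch, pos_photo, neg_photo):
    """Margin-based triplet loss: the paired photo must sit closer to the
    sketch than the negative photo, by at least the (learned) margin."""
    s, p, n = model(sketch), model(pos_photo), model(neg_photo)
    d_pos = (s - p).pow(2).sum(dim=-1)
    d_neg = (s - n).pow(2).sum(dim=-1)
    return F.relu(d_pos - d_neg + model.margin).mean()


def adapt_few_shot(meta_model, support_triplets, inner_lr=0.01, steps=5):
    """Inner loop at test time: clone the meta-trained weights and take a
    few gradient steps on the handful of support triplets from the new
    category/style, then retrieve with the adapted copy."""
    adapted = copy.deepcopy(meta_model)
    opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = margin_loss(adapted, *support_triplets)
        loss.backward()
        opt.step()
    return adapted


# Example: adapt to a new sketcher's style from 4 support triplets
# (random tensors stand in for sketch / positive photo / negative photo).
model = SketchEmbedder()
support = tuple(torch.randn(4, 512) for _ in range(3))
adapted = adapt_few_shot(model, support)
```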
