Paper Title
On the Exploration of Incremental Learning for Fine-grained Image Retrieval
Paper Authors
Paper Abstract
In this paper, we consider the problem of fine-grained image retrieval in an incremental setting, where new categories are added over time. On the one hand, repeatedly retraining the representation on the extended dataset is time-consuming. On the other hand, fine-tuning the learned representation only on the new classes leads to catastrophic forgetting. To this end, we propose an incremental learning method to mitigate the retrieval performance degradation caused by forgetting. Without accessing any samples of the original classes, the classifier of the original network provides soft "labels" to transfer knowledge to the adaptive network during training, so as to preserve the previous classification capability. More importantly, a regularization function based on Maximum Mean Discrepancy is devised to minimize the discrepancy between the new-class features produced by the original network and by the adaptive network. Extensive experiments on two datasets show that our method effectively mitigates catastrophic forgetting on the original classes while achieving high performance on the new classes.
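The abstract describes two loss components: a distillation term in which the frozen original classifier supplies soft "labels" for the adaptive network, and a Maximum Mean Discrepancy (MMD) regularizer that aligns new-class features from the two networks. A minimal NumPy sketch of both terms is given below; the Gaussian kernel, the temperature parameter, and all function names are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    e = np.exp((z - z.max(-1, keepdims=True)) / T)
    return e / e.sum(-1, keepdims=True)

def distill_loss(logits_orig, logits_adapt, T=2.0):
    # Cross-entropy between the frozen original classifier's soft "labels"
    # and the adaptive network's predictions; no original-class samples
    # are needed, only logits computed on the current (new-class) images.
    p = softmax(logits_orig, T)   # soft targets from the original network
    q = softmax(logits_adapt, T)  # predictions of the adaptive network
    return -(p * np.log(q + 1e-12)).sum(-1).mean()

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel between rows of x and rows of y.
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    # Biased estimate of squared MMD between two sets of feature vectors:
    # mean k(x,x') + mean k(y,y') - 2 * mean k(x,y).
    return (gaussian_kernel(x, x, sigma).mean()
            + gaussian_kernel(y, y, sigma).mean()
            - 2.0 * gaussian_kernel(x, y, sigma).mean())

# Toy usage: features of new-class images from the two networks.
rng = np.random.default_rng(0)
f_orig = rng.normal(size=(8, 4))               # original-network features
f_adapt = f_orig + rng.normal(0, 0.01, (8, 4)) # nearly aligned features
print(mmd2(f_orig, f_adapt))                   # small value, near zero
```

In a full training loop these two terms would be added, with weighting hyperparameters, to the standard classification loss on the new classes; the MMD term pulls the adaptive network's new-class feature distribution toward that of the frozen original network.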