Paper Title

Self-distilled Knowledge Delegator for Exemplar-free Class Incremental Learning

Paper Authors

Fanfan Ye, Liang Ma, Qiaoyong Zhong, Di Xie, Shiliang Pu

Paper Abstract

Exemplar-free incremental learning is extremely challenging due to inaccessibility of data from old tasks. In this paper, we attempt to exploit the knowledge encoded in a previously trained classification model to handle the catastrophic forgetting problem in continual learning. Specifically, we introduce a so-called knowledge delegator, which is capable of transferring knowledge from the trained model to a randomly re-initialized new model by generating informative samples. Given the previous model only, the delegator is effectively learned using a self-distillation mechanism in a data-free manner. The knowledge extracted by the delegator is then utilized to maintain the performance of the model on old tasks in incremental learning. This simple incremental learning framework surpasses existing exemplar-free methods by a large margin on four widely used class incremental benchmarks, namely CIFAR-100, ImageNet-Subset, Caltech-101 and Flowers-102. Notably, we achieve comparable performance to some exemplar-based methods without accessing any exemplars.
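
The abstract outlines a data-free pipeline: a generator (the "delegator") is learned from the frozen previous model alone, and its synthetic samples are then used to distill old-task knowledge into a re-initialized model during incremental training. Below is a minimal PyTorch-style sketch of that idea; the generator architecture, the confidence/class-balance objective used here as a stand-in for the paper's self-distillation loss, and all function names and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Sketch of data-free knowledge transfer via a generator ("delegator").
# All design choices below are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Delegator(nn.Module):
    """Maps random noise to image-like samples that carry the old model's knowledge."""
    def __init__(self, noise_dim=128, img_channels=3, img_size=32):
        super().__init__()
        self.img_shape = (img_channels, img_size, img_size)
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 512), nn.ReLU(inplace=True),
            nn.Linear(512, img_channels * img_size * img_size), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z).view(z.size(0), *self.img_shape)

def train_delegator(old_model, delegator, steps=1000, batch_size=64, noise_dim=128):
    """Learn the delegator from the frozen old model only, with no real data."""
    old_model.eval()
    for p in old_model.parameters():          # keep the previous model fixed
        p.requires_grad_(False)
    opt = torch.optim.Adam(delegator.parameters(), lr=1e-3)
    for _ in range(steps):
        z = torch.randn(batch_size, noise_dim)
        logits = old_model(delegator(z))
        probs = F.softmax(logits, dim=1)
        # Push the old model toward confident predictions on generated samples,
        # and keep the batch class-balanced, so samples probe its decision boundaries.
        conf_loss = F.cross_entropy(logits, probs.argmax(dim=1))
        mean_probs = probs.mean(dim=0)
        balance_loss = (mean_probs * torch.log(mean_probs + 1e-8)).sum()
        loss = conf_loss + balance_loss
        opt.zero_grad()
        loss.backward()
        opt.step()

def distill_old_knowledge(old_model, new_model, delegator, temperature=2.0):
    """One distillation step: generated samples transfer old-task knowledge."""
    z = torch.randn(64, 128)
    with torch.no_grad():
        fake = delegator(z)
        teacher_logits = old_model(fake)
    student_logits = new_model(fake)
    return F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
```

In an incremental step, a term like distill_old_knowledge would be added to the standard cross-entropy loss on the new task's real data, so that the randomly re-initialized model retains old-task performance without access to any stored exemplars.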
