Paper Title

Semantic Drift Compensation for Class-Incremental Learning

Authors

Lu Yu, Bartłomiej Twardowski, Xialei Liu, Luis Herranz, Kai Wang, Yongmei Cheng, Shangling Jui, Joost van de Weijer

Abstract

Class-incremental learning of deep networks sequentially increases the number of classes to be classified. During training, the network only has access to the data of one task at a time, where each task contains several classes. In this setting, networks suffer from catastrophic forgetting, which refers to the drastic drop in performance on previous tasks. The vast majority of methods have studied this scenario for classification networks, where for each new task the classification layer of the network must be augmented with additional weights to make room for the newly added classes. Embedding networks have the advantage that new classes can be naturally included in the network without adding new weights. Therefore, we study incremental learning for embedding networks. In addition, we propose a new method to estimate the drift of features, called semantic drift, and to compensate for it without the need for any exemplars. We approximate the drift of previous tasks based on the drift experienced by current-task data. We perform experiments on fine-grained datasets, CIFAR100, and ImageNet-Subset. We demonstrate that embedding networks suffer significantly less from catastrophic forgetting. We outperform existing methods which do not require exemplars and obtain competitive results compared to methods which store exemplars. Furthermore, we show that our proposed semantic drift compensation (SDC), when combined with existing methods to prevent forgetting, consistently improves results.
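The abstract's central idea is to estimate how the stored class-mean prototypes of previous tasks have drifted by measuring how current-task embeddings move between the old and the new model. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the function name, the Gaussian weighting of drift vectors by proximity to each prototype, and the bandwidth `sigma` are assumptions made for this illustration.

```python
import numpy as np

def compensate_prototypes(prototypes, feats_before, feats_after, sigma=0.3):
    """Shift previous-task class prototypes by the semantic drift estimated
    from current-task embeddings (no exemplars of old classes are needed).

    prototypes   : (C, D) class-mean embeddings saved after the previous task.
    feats_before : (N, D) embeddings of current-task images under the OLD model.
    feats_after  : (N, D) embeddings of the same images under the NEW model.
    sigma        : bandwidth of the Gaussian weighting (illustrative value).
    """
    prototypes = np.asarray(prototypes, dtype=float)
    feats_before = np.asarray(feats_before, dtype=float)
    feats_after = np.asarray(feats_after, dtype=float)

    # Drift vector of every current-task sample: how it moved in feature space.
    drift = feats_after - feats_before                        # (N, D)

    compensated = np.empty_like(prototypes)
    for c, proto in enumerate(prototypes):
        # Weight each sample's drift by its proximity (in the old feature
        # space) to this prototype, so nearby samples dominate the estimate.
        dist2 = np.sum((feats_before - proto) ** 2, axis=1)   # (N,)
        w = np.exp(-dist2 / (2 * sigma ** 2))                 # (N,)
        # The weighted average drift approximates how the prototype moved.
        compensated[c] = proto + (w[:, None] * drift).sum(axis=0) / (w.sum() + 1e-8)
    return compensated
```

At test time, nearest-class-mean classification would then be run against the compensated prototypes together with the class means of the newly learned task.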
