Paper Title

FOSTER: Feature Boosting and Compression for Class-Incremental Learning

Authors

Fu-Yun Wang, Da-Wei Zhou, Han-Jia Ye, De-Chuan Zhan

Abstract

The ability to learn new concepts continually is necessary in this ever-changing world. However, deep neural networks suffer from catastrophic forgetting when learning new categories. Many works have been proposed to alleviate this phenomenon, but most of them either fall into the stability-plasticity dilemma or incur excessive computation or storage overhead. Inspired by the gradient boosting algorithm, which gradually fits the residuals between the target and the previous ensemble model, we propose FOSTER, a novel two-stage learning paradigm that empowers the model to learn new categories adaptively. Specifically, we first dynamically expand new modules to fit the residuals between the target and the output of the original model. Next, we remove redundant parameters and feature dimensions through an effective distillation strategy to maintain a single backbone model. We validate FOSTER on CIFAR-100 and ImageNet-100/1000 under different settings. Experimental results show that our method achieves state-of-the-art performance. Code is available at: https://github.com/G-U-N/ECCV22-FOSTER.
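
To make the two-stage paradigm concrete, below is a minimal PyTorch sketch of the boosting-then-compression loop. All names here (`BoostedModel`, `boosting_step`, `compression_step`) are illustrative assumptions, not the authors' API; the summed-logits ensemble and plain temperature-scaled KL distillation are deliberate simplifications of the paper's gradient-boosting view (the actual architecture and losses are in the linked repository).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BoostedModel(nn.Module):
    """Stage 1 (boosting): a frozen old backbone plus a trainable new module.

    Summing the logits means the new module is trained to fit the residual
    between the targets and the old model's output, in the spirit of
    gradient boosting. (This is a simplification of the paper's design.)
    """

    def __init__(self, old_net: nn.Module, new_net: nn.Module):
        super().__init__()
        self.old_net = old_net
        for p in self.old_net.parameters():
            p.requires_grad_(False)  # the previous model stays fixed
        self.new_net = new_net

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():
            old_logits = self.old_net(x)
        return old_logits + self.new_net(x)  # ensemble prediction


def boosting_step(model: BoostedModel, x, y, optimizer) -> float:
    """Train only the expanded module so the ensemble fits the labels."""
    loss = F.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


def compression_step(student, teacher, x, optimizer, T: float = 2.0) -> float:
    """Stage 2 (compression): distill the two-backbone teacher into a
    single student backbone so model size stays constant across tasks."""
    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Under these assumptions, each incremental task would first run boosting steps over the new task's data (updating only the expanded module), then run compression steps to fold the two-backbone ensemble back into a single backbone before the next task arrives, keeping the parameter count constant across tasks.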
