Paper Title

Improving Feature Generalizability with Multitask Learning in Class Incremental Learning

Authors

Dong Ma, Chi Ian Tang, Cecilia Mascolo

Abstract

Many deep learning applications, like keyword spotting, require the incorporation of new concepts (classes) over time, referred to as Class Incremental Learning (CIL). The major challenge in CIL is catastrophic forgetting, i.e., preserving as much of the old knowledge as possible while learning new tasks. Various techniques, such as regularization, knowledge distillation, and the use of exemplars, have been proposed to resolve this issue. However, prior works primarily focus on the incremental learning step, while ignoring the optimization during the base model training. We hypothesize that a more transferable and generalizable feature representation from the base model would be beneficial to incremental learning. In this work, we adopt multitask learning during base model training to improve the feature generalizability. Specifically, instead of training a single model with all the base classes, we decompose the base classes into multiple subsets and regard each of them as a task. These tasks are trained concurrently and a shared feature extractor is obtained for incremental learning. We evaluate our approach on two datasets under various configurations. The results show that our approach enhances the average incremental learning accuracy by up to 5.5%, which enables more reliable and accurate keyword spotting over time. Moreover, the proposed approach can be combined with many existing techniques and provides additional performance gain.
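
The abstract describes the core idea: split the base classes into several subsets, treat each subset as a task, and train all tasks concurrently so that a shared feature extractor is obtained for the later incremental-learning steps. The sketch below is a minimal, hypothetical PyTorch illustration of that idea, not the authors' implementation; the network architecture, feature dimension, number of tasks, and training details are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of multitask base-model training for CIL:
# the base classes are split into subsets, each treated as a task with its own
# classification head on top of a shared feature extractor. Architecture, number
# of tasks, and hyperparameters below are assumptions, not the paper's setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedExtractor(nn.Module):
    """Shared feature extractor, reused later for incremental learning."""
    def __init__(self, in_dim=64, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class MultitaskBaseModel(nn.Module):
    """One classification head per task (subset of base classes), all sharing the extractor."""
    def __init__(self, extractor, classes_per_task, feat_dim=128):
        super().__init__()
        self.extractor = extractor
        self.heads = nn.ModuleList([nn.Linear(feat_dim, n) for n in classes_per_task])

    def forward(self, x, task_id):
        return self.heads[task_id](self.extractor(x))

def train_step(model, optimizer, task_batches):
    """Concurrent training: sum the per-task losses in a single update.

    task_batches: list of (inputs, labels) pairs, one per task, with labels
    re-indexed to 0..n_t-1 within each task's class subset.
    """
    model.train()
    optimizer.zero_grad()
    total_loss = 0.0
    for task_id, (x, y) in enumerate(task_batches):
        logits = model(x, task_id)
        total_loss = total_loss + F.cross_entropy(logits, y)
    total_loss.backward()
    optimizer.step()
    return total_loss.item()

if __name__ == "__main__":
    # Example: 20 base classes split into 2 tasks of 10 classes each.
    extractor = SharedExtractor()
    model = MultitaskBaseModel(extractor, classes_per_task=[10, 10])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Dummy batches standing in for, e.g., keyword-spotting features.
    batches = [(torch.randn(32, 64), torch.randint(0, 10, (32,))) for _ in range(2)]
    print(train_step(model, opt, batches))
    # After base training, `extractor` is kept as the shared feature extractor
    # for the subsequent class-incremental learning steps; the per-task heads
    # are only used to drive the multitask objective.
```

Summing the per-task losses in a single optimizer step is one simple way to realize the "trained concurrently" setup mentioned in the abstract; the per-task heads prevent any single head from dominating the shared representation.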
