Paper Title

Class Impression for Data-free Incremental Learning

Authors

Sana Ayromlou, Purang Abolmaesumi, Teresa Tsang, Xiaoxiao Li

Abstract

Standard deep learning-based classification approaches require collecting all samples from all classes in advance and are trained offline. This paradigm may not be practical in real-world clinical applications, where new classes are incrementally introduced through the addition of new data. Class incremental learning is a strategy that allows learning from such data. However, a major challenge is catastrophic forgetting, i.e., performance degradation on previous classes when adapting a trained model to new data. Prior methodologies that alleviate this challenge save a portion of the training data, requiring perpetual storage of such data, which may introduce privacy issues. Here, we propose a novel data-free class incremental learning framework that first synthesizes data from the model trained on previous classes to generate a Class Impression. Subsequently, it updates the model by combining the synthesized data with new class data. Furthermore, we incorporate a cosine-normalized cross-entropy loss to mitigate the adverse effects of class imbalance, a margin loss to increase separation between previous classes and new ones, and an intra-domain contrastive loss to generalize the model trained on synthesized data to real data. We compare our proposed framework with state-of-the-art methods in class incremental learning and demonstrate improved accuracy on the classification of 11,062 echocardiography cine series of patients.
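To make the data-free synthesis step concrete, below is a minimal PyTorch sketch of generating class impressions from a frozen classifier by optimizing random noise toward each previous class. This is an illustrative reading of the abstract, not the paper's released code: the function name `synthesize_class_impressions`, the hyperparameters (`num_steps`, `lr`, image shape), and the total-variation prior are all assumptions.

```python
# Minimal sketch of data-free "class impression" synthesis, assuming a frozen
# classifier `model` trained on previous classes. All names and settings here
# are illustrative assumptions, not the paper's exact configuration.
import torch
import torch.nn.functional as F

def synthesize_class_impressions(model, class_ids, image_shape=(1, 112, 112),
                                 num_steps=200, lr=0.05):
    """Optimize random noise so the frozen model classifies it as each target
    previous class; the resulting images stand in for unavailable old data."""
    model.eval()
    labels = torch.tensor(class_ids)
    # One synthetic image per previous class, initialized from Gaussian noise.
    images = torch.randn(len(class_ids), *image_shape, requires_grad=True)
    optimizer = torch.optim.Adam([images], lr=lr)
    for _ in range(num_steps):
        optimizer.zero_grad()
        logits = model(images)
        # Cross-entropy pulls each image toward its assigned previous class.
        ce = F.cross_entropy(logits, labels)
        # Light total-variation prior keeps the impressions smooth (a common
        # regularizer in data-free synthesis; its use here is an assumption).
        tv = (images[..., 1:, :] - images[..., :-1, :]).abs().mean() + \
             (images[..., :, 1:] - images[..., :, :-1]).abs().mean()
        (ce + 1e-4 * tv).backward()
        optimizer.step()
    return images.detach(), labels
```

Per the abstract, the synthesized impressions would then be mixed with new-class data to update the model, trained under the cosine-normalized cross-entropy, margin, and intra-domain contrastive losses.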
