Paper Title


Learnable Distribution Calibration for Few-Shot Class-Incremental Learning

Authors

Binghao Liu, Boyu Yang, Lingxi Xie, Ren Wang, Qi Tian, Qixiang Ye

Abstract


Few-shot class-incremental learning (FSCIL) faces the challenges of memorizing old class distributions and estimating new class distributions from few training samples. In this study, we propose a learnable distribution calibration (LDC) approach, with the aim of systematically solving these two challenges within a unified framework. LDC is built upon a parameterized calibration unit (PCU), which initializes biased distributions for all classes based on classifier vectors (memory-free) and a single covariance matrix. The covariance matrix is shared by all classes, so that the memory cost is fixed. During base training, PCU is endowed with the ability to calibrate biased distributions by recurrently updating sampled features under the supervision of real distributions. During incremental learning, PCU recovers distributions for old classes to avoid `forgetting', and estimates distributions and augments samples for new classes to alleviate `over-fitting' caused by the biased distributions of few-shot samples. LDC is theoretically plausible as it can be formulated as a variational inference procedure. It also improves the flexibility of FSCIL, as the training procedure requires no class-similarity prior. Experiments on the CUB200, CIFAR100, and mini-ImageNet datasets show that LDC outperforms the state of the art by 4.64%, 1.98%, and 3.97%, respectively. LDC's effectiveness is also validated on few-shot learning scenarios.
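The abstract's core mechanism can be illustrated with a minimal sketch: each class distribution is initialized with its classifier vector as the mean (memory-free) and a single covariance matrix shared by all classes, so that memory cost stays fixed while augmented features can be sampled for few-shot classes. All names, shapes, and the identity-scaled covariance below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
feat_dim, n_classes = 64, 5

# Classifier vectors double as the (initially biased) class means.
classifier_vectors = rng.normal(size=(n_classes, feat_dim))

# One covariance matrix shared across all classes (fixed memory cost);
# a scaled identity is used here purely for illustration.
shared_cov = np.eye(feat_dim) * 0.1

def sample_augmented_features(class_idx, n_samples=20):
    """Draw augmented features for one class from its estimated Gaussian."""
    mean = classifier_vectors[class_idx]
    return rng.multivariate_normal(mean, shared_cov, size=n_samples)

# Augment a few-shot class with 20 sampled features.
aug = sample_augmented_features(0)
print(aug.shape)  # (20, 64)
```

In the full method these sampled features would then be recurrently refined by the parameterized calibration unit under the supervision of real distributions; the sketch only shows the shared-covariance sampling step.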
