Paper Title

Dynamic Loss For Robust Learning

Authors

Shenwang Jiang, Jianan Li, Jizhou Zhang, Ying Wang, Tingfa Xu

Abstract

Label noise and class imbalance commonly coexist in real-world data. Previous works on robust learning, however, usually address only one type of data bias and underperform when facing both. To mitigate this gap, this work presents a novel meta-learning based dynamic loss that automatically adjusts the objective function over the course of training to robustly learn a classifier from long-tailed noisy data. Concretely, our dynamic loss comprises a label corrector and a margin generator, which respectively correct noisy labels and generate additive per-class classification margins by perceiving the underlying data distribution as well as the learning state of the classifier. Equipped with a new hierarchical sampling strategy that enriches a small amount of unbiased metadata with diverse and hard samples, the two components of the dynamic loss are optimized jointly through meta-learning, cultivating a classifier that adapts well to clean and balanced test data. Extensive experiments show our method achieves state-of-the-art accuracy on multiple real-world and synthetic datasets with various types of data biases, including CIFAR-10/100, ANIMAL-10N, ImageNet-LT, and WebVision. Code will soon be publicly available.
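To make the two components concrete, the sketch below shows the general idea behind a loss that combines label correction (blending possibly noisy hard labels with the model's own predictions) with additive per-class margins subtracted from the target-class logit. This is a hypothetical illustration, not the authors' implementation: in the paper, the blending weights and margins are produced by meta-learned modules, whereas here `alpha` and `margins` are plain inputs standing in for those quantities.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def log_softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    return z - np.log(np.exp(z).sum(axis=1, keepdims=True))

def dynamic_loss(logits, labels, alpha, margins):
    """Toy dynamic loss. `alpha` is the trust placed in the given
    (possibly noisy) labels; `margins` holds one additive margin per
    class. Both stand in for meta-learned quantities in the paper."""
    n, c = logits.shape
    one_hot = np.eye(c)[labels]
    # Label corrector: blend hard labels with current predictions
    # to form corrected soft targets.
    soft_targets = alpha * one_hot + (1.0 - alpha) * softmax(logits)
    # Margin generator: subtract each sample's target-class margin
    # from its target logit, enforcing a larger decision margin.
    adjusted = logits.copy()
    adjusted[np.arange(n), labels] -= margins[labels]
    # Cross-entropy against the corrected soft targets.
    return -(soft_targets * log_softmax(adjusted)).sum(axis=1).mean()
```

Assigning larger margins to minority classes makes the classifier pay a higher price for weakly separating them, which is one common way additive per-class margins counteract class imbalance.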
