Paper Title

Neural Collapse Inspired Attraction-Repulsion-Balanced Loss for Imbalanced Learning

Paper Authors

Liang Xie, Yibo Yang, Deng Cai, Xiaofei He

Paper Abstract

Class-imbalanced distributions are widespread in real-world engineering. However, mainstream optimization algorithms that seek to minimize error will trap a deep learning model in sub-optima when facing extreme class imbalance, which seriously harms classification precision, especially on the minority classes. The essential reason is that the gradients of the classifier weights are imbalanced among the components contributed by different classes. In this paper, we propose the Attraction-Repulsion-Balanced Loss (ARB-Loss) to balance the different components of the gradients. We perform experiments on large-scale classification and segmentation datasets, and our ARB-Loss achieves state-of-the-art performance via only one-stage training, instead of the two-stage learning used in recent state-of-the-art works.
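The attraction-repulsion decomposition mentioned in the abstract can be illustrated with a toy sketch: per-sample cross-entropy splits into an "attraction" term (pulling the true-class logit up) and a "repulsion" term (pushing the other-class logits down), and each term can be reweighted separately. This is only a minimal illustration of the idea; the `alpha`/`beta` weights and the function name below are assumptions, not the paper's actual balancing scheme, which the abstract does not specify.

```python
import numpy as np

def arb_style_loss(logits, labels, alpha=1.0, beta=1.0):
    """Toy attraction-repulsion split of cross-entropy (illustrative only).

    With alpha = beta = 1 this reduces exactly to standard cross-entropy:
        CE = log(1 + exp(lse_others - z_true)),
    where z_true is the true-class logit (attraction target) and
    lse_others is the log-sum-exp over the remaining logits (repulsion).
    alpha/beta are hypothetical per-component weights.
    """
    n = logits.shape[0]
    true_logit = logits[np.arange(n), labels]          # attraction component

    # log-sum-exp over the *other* classes = repulsion component
    mask = np.ones_like(logits, dtype=bool)
    mask[np.arange(n), labels] = False
    others = logits[mask].reshape(n, -1)
    m = others.max(axis=1, keepdims=True)              # numerical stability
    lse_others = (m + np.log(np.exp(others - m).sum(axis=1, keepdims=True))).squeeze(1)

    attraction = -alpha * true_logit
    repulsion = beta * lse_others
    return np.log1p(np.exp(repulsion + attraction)).mean()
```

Raising the repulsion weight (e.g. to counteract the overwhelming "push" that majority classes exert on minority-class classifier weights) strictly increases the penalty on competing logits, which is one way to rebalance the gradient components the abstract refers to.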
