Paper Title
blob loss: instance imbalance aware loss functions for semantic segmentation
Paper Authors
Paper Abstract
Deep convolutional neural networks (CNNs) have proven to be remarkably effective in semantic segmentation tasks. The most popular loss functions were introduced to target improved volumetric scores, such as the Dice coefficient (DSC). By design, DSC can tackle class imbalance; however, it does not recognize instance imbalance within a class. As a result, a large foreground instance can dominate minor instances and still produce a satisfactory DSC. Nevertheless, detecting tiny instances is crucial for many applications, such as disease monitoring. For example, it is imperative to locate and surveil small-scale lesions in the follow-up of multiple sclerosis patients. We propose a novel family of loss functions, \emph{blob loss}, primarily aimed at maximizing instance-level detection metrics, such as F1 score and sensitivity. \emph{Blob loss} is designed for semantic segmentation problems where detecting multiple instances matters. We extensively evaluate a DSC-based \emph{blob loss} in five complex 3D semantic segmentation tasks featuring pronounced instance heterogeneity in terms of texture and morphology. Compared to soft Dice loss, we achieve a 5% improvement for MS lesions, a 3% improvement for liver tumors, and an average 2% improvement for microscopy segmentation tasks, considering F1 score.
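To make the instance-imbalance idea concrete, below is a minimal sketch of a DSC-based, connected-component ("blob") loss in PyTorch. It is written from the abstract alone: the per-instance masking of competing blobs and the `alpha`/`beta` weighting between the global and per-instance terms are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch of an instance-aware ("blob") Dice loss. The masking scheme
# and the alpha/beta weights are assumptions made for illustration only.
import torch
from scipy.ndimage import label


def soft_dice_loss(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Standard soft Dice loss over all voxels (1 - DSC)."""
    intersection = (pred * target).sum()
    return 1.0 - (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)


def blob_dice_loss(pred: torch.Tensor, target: torch.Tensor,
                   alpha: float = 1.0, beta: float = 1.0) -> torch.Tensor:
    """Combine a global Dice term with the mean of per-instance Dice terms.

    pred and target are 3D tensors of shape (D, H, W); pred holds foreground
    probabilities, target holds a binary ground-truth mask.
    """
    global_term = soft_dice_loss(pred, target)

    # Split the ground-truth foreground into connected components ("blobs").
    components, n_blobs = label(target.detach().cpu().numpy())
    if n_blobs == 0:
        return alpha * global_term

    instance_terms = []
    for blob_id in range(1, n_blobs + 1):
        blob_mask = torch.from_numpy(components == blob_id).to(pred.device)
        # Mask out voxels belonging to *other* blobs so each instance is
        # scored independently of the rest of the foreground.
        keep = ~torch.from_numpy((components > 0) & (components != blob_id)).to(pred.device)
        instance_terms.append(soft_dice_loss(pred * keep, blob_mask.float() * keep))

    instance_term = torch.stack(instance_terms).mean()
    return alpha * global_term + beta * instance_term


# Example: a random prediction against a ground truth containing one large
# and one tiny blob; both contribute equally to the instance term.
# pred = torch.rand(32, 64, 64)
# gt = torch.zeros(32, 64, 64)
# gt[2:10, 5:20, 5:20] = 1      # large lesion
# gt[20:22, 40:42, 40:42] = 1   # tiny lesion
# loss = blob_dice_loss(pred, gt)
```

Averaging the per-instance Dice terms gives every blob equal weight regardless of its volume, which is what keeps a single large lesion from dominating the loss the way it can dominate a purely volumetric DSC.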