Paper Title

Instance-aware Self-supervised Learning for Nuclei Segmentation

Paper Authors

Xinpeng Xie, Jiawei Chen, Yuexiang Li, Linlin Shen, Kai Ma, Yefeng Zheng

Abstract

Due to the wide existence and large morphological variance of nuclei, accurate nuclei instance segmentation is still one of the most challenging tasks in computational pathology. Annotating nuclei instances, which requires experienced pathologists to manually draw the contours, is extremely laborious and expensive and often results in a deficiency of annotated data. Deep-learning-based segmentation approaches, which rely heavily on the quantity of training data, therefore struggle to fully demonstrate their capacity in this area. In this paper, we propose a novel self-supervised learning framework to deeply exploit the capacity of widely used convolutional neural networks (CNNs) on the nuclei instance segmentation task. The proposed approach involves two sub-tasks (i.e., scale-wise triplet learning and count ranking), which enable neural networks to implicitly leverage prior knowledge of nuclei size and quantity, and accordingly mine instance-aware feature representations from the raw data. Experimental results on the publicly available MoNuSeg dataset show that the proposed self-supervised learning approach can remarkably boost the segmentation accuracy of nuclei instances: a new state-of-the-art average Aggregated Jaccard Index (AJI) of 70.63% is achieved by our self-supervised ResUNet-101. To the best of our knowledge, this is the first work focusing on self-supervised learning for instance segmentation.
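The two sub-tasks named in the abstract can be illustrated with standard losses: a triplet loss that pulls together embeddings of patches cropped at the same scale (implicitly encoding nuclei size) and a margin ranking loss that forces a larger crop, which contains at least as many nuclei as a sub-crop of it, to score higher on a count head (implicitly encoding nuclei quantity). The minimal sketch below is an assumption-laden illustration in PyTorch; the toy encoder, embedding size, margins, and crop sizes are placeholders, not the authors' implementation (which uses a ResUNet-101 backbone).

```python
import torch
import torch.nn as nn

# Toy stand-in for the ResUNet-101 encoder (illustrative only).
encoder = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 16),
)

# Scale-wise triplet learning: two crops taken at the same scale
# (anchor, positive) should embed closer together than a crop taken
# at a different scale (negative), after resizing to a common size.
triplet = nn.TripletMarginLoss(margin=1.0)
anchor   = encoder(torch.randn(4, 3, 64, 64))  # scale-A crop
positive = encoder(torch.randn(4, 3, 64, 64))  # another scale-A crop
negative = encoder(torch.randn(4, 3, 64, 64))  # scale-B crop, resized
loss_triplet = triplet(anchor, positive, negative)

# Count ranking: a full crop contains at least as many nuclei as a
# sub-crop of it, so its predicted count score should rank higher
# (target y = 1 means "first input should be ranked above second").
count_head = nn.Linear(16, 1)
score_full = count_head(encoder(torch.randn(4, 3, 64, 64)))
score_sub  = count_head(encoder(torch.randn(4, 3, 64, 64)))
ranking = nn.MarginRankingLoss(margin=0.1)
loss_rank = ranking(score_full, score_sub, torch.ones(4, 1))

# Both objectives jointly train the shared encoder.
loss = loss_triplet + loss_rank
loss.backward()
```

Both losses are computed purely from how the crops were generated, so no pathologist annotations are needed during this pretraining stage.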
