Paper Title


Superpixel Pre-Segmentation of HER2 Slides for Efficient Annotation

Paper Authors

Mathias Öttl, Jana Mönius, Christian Marzahl, Matthias Rübner, Carol I. Geppert, Arndt Hartmann, Matthias W. Beckmann, Peter Fasching, Andreas Maier, Ramona Erber, Katharina Breininger

Paper Abstract


Supervised deep learning has shown state-of-the-art performance for medical image segmentation across different applications, including histopathology and cancer research; however, the manual annotation of such data is extremely laborious. In this work, we explore the use of superpixel approaches to compute a pre-segmentation of HER2 stained images for breast cancer diagnosis that facilitates faster manual annotation and correction in a second step. Four methods are compared: Standard Simple Linear Iterative Clustering (SLIC) as a baseline, a domain adapted SLIC, and superpixels based on feature embeddings of a pretrained ResNet-50 and a denoising autoencoder. To tackle oversegmentation, we propose to hierarchically merge superpixels, based on their content in the respective feature space. When evaluating the approaches on fully manually annotated images, we observe that the autoencoder-based superpixels achieve a 23% increase in boundary F1 score compared to the baseline SLIC superpixels. Furthermore, the boundary F1 score increases by 73% when hierarchical clustering is applied on the adapted SLIC and the autoencoder-based superpixels. These evaluations show encouraging first results for a pre-segmentation for efficient manual refinement without the need for an initial set of annotated training data.
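
The abstract describes a two-stage pipeline: oversegment the image into superpixels, then hierarchically merge the superpixels based on their content in a feature space. The following is a minimal, illustrative sketch of that idea, not the authors' implementation: it uses scikit-image's standard SLIC (the baseline method, not the domain-adapted variant), replaces the ResNet-50/autoencoder embeddings with simple per-superpixel mean RGB features, and assumes Ward linkage, the target region count, and the input filename purely for illustration.

```python
# Sketch (assumptions noted above): SLIC pre-segmentation followed by
# hierarchical merging of superpixels using their mean feature vectors.
import numpy as np
from skimage import io, segmentation
from scipy.cluster.hierarchy import linkage, fcluster


def slic_presegmentation(image, n_segments=2000, compactness=10.0):
    """Oversegment the image into SLIC superpixels (scikit-image implementation)."""
    return segmentation.slic(image, n_segments=n_segments,
                             compactness=compactness, start_label=0)


def superpixel_features(image, labels):
    """Mean feature vector per superpixel; here simply the mean RGB value.
    Pooled ResNet-50 or denoising-autoencoder embeddings could be plugged
    in the same way."""
    n = labels.max() + 1
    feats = np.zeros((n, image.shape[-1]), dtype=np.float64)
    for sp in range(n):
        feats[sp] = image[labels == sp].mean(axis=0)
    return feats


def merge_superpixels(labels, feats, n_regions=200):
    """Hierarchically cluster superpixels in feature space and relabel the map.
    Ward linkage and the cluster count are assumptions of this sketch."""
    n_regions = min(n_regions, feats.shape[0])
    Z = linkage(feats, method="ward")
    assignment = fcluster(Z, t=n_regions, criterion="maxclust") - 1
    return assignment[labels]


if __name__ == "__main__":
    image = io.imread("her2_tile.png")[..., :3]   # hypothetical HER2-stained tile
    sp_labels = slic_presegmentation(image)
    feats = superpixel_features(image.astype(np.float64), sp_labels)
    merged = merge_superpixels(sp_labels, feats)
    print("superpixels:", sp_labels.max() + 1, "-> merged regions:", merged.max() + 1)
```

In practice, `superpixel_features` would be replaced by embeddings pooled from the pretrained ResNet-50 or the denoising autoencoder described in the abstract, and the merge level could be chosen so that the resulting regions are convenient to refine during manual annotation.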
