Paper Title
Data Augmentation-free Unsupervised Learning for 3D Point Cloud Understanding
Paper Authors
Paper Abstract
Unsupervised learning on 3D point clouds has undergone a rapid evolution, especially thanks to data augmentation-based contrastive methods. However, data augmentation is not ideal as it requires a careful selection of the type of augmentations to perform, which in turn can affect the geometric and semantic information learned by the network during self-training. To overcome this issue, we propose an augmentation-free unsupervised approach for point clouds to learn transferable point-level features via soft clustering, named SoftClu. SoftClu assumes that the points belonging to a cluster should be close to each other in both geometric and feature spaces. This differs from typical contrastive learning, which builds similar representations for a whole point cloud and its augmented versions. We exploit the affiliation of points to their clusters as a proxy to enable self-training through a pseudo-label prediction task. Under the constraint that these pseudo-labels induce the equipartition of the point cloud, we cast SoftClu as an optimal transport problem. We formulate an unsupervised loss to minimize the standard cross-entropy between pseudo-labels and predicted labels. Experiments on downstream applications, such as 3D object classification, part segmentation, and semantic segmentation, show the effectiveness of our framework in outperforming state-of-the-art techniques.
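To make the described pipeline concrete, below is a minimal PyTorch sketch (not the authors' released code) of the core idea in the abstract: point-to-cluster similarity scores are projected onto an equipartition by a Sinkhorn-style optimal-transport step to obtain soft pseudo-labels, and the network is trained with a standard cross-entropy between these pseudo-labels and its predicted cluster assignments. The function names, the Sinkhorn-style solver, the temperature, and all hyperparameters are illustrative assumptions rather than the paper's exact formulation.

```python
# Minimal sketch of the pseudo-label idea described above (assumptions noted in the lead-in):
# points are softly assigned to K clusters, an optimal-transport step enforces an approximately
# equal-sized partition, and the network is trained with a cross-entropy between these
# pseudo-labels and its predicted cluster probabilities.
import torch
import torch.nn.functional as F

def sinkhorn_equipartition(scores, n_iters=3, eps=0.05):
    """Project point-to-cluster scores (N x K) onto the transport polytope with
    uniform marginals, yielding soft pseudo-labels that equipartition the points."""
    q = torch.exp(scores / eps)              # N x K, positive transport plan
    q /= q.sum()
    n, k = q.shape
    for _ in range(n_iters):
        q /= q.sum(dim=0, keepdim=True)      # each cluster receives 1/K of the total mass
        q /= k
        q /= q.sum(dim=1, keepdim=True)      # each point distributes a mass of 1/N
        q /= n
    return (q * n).detach()                  # rows sum to 1: soft pseudo-labels, no gradient

def softclu_style_loss(features, centroids, temperature=0.1):
    """Cross-entropy between OT pseudo-labels and predicted cluster assignments.
    features: N x D point-level features; centroids: K x D cluster prototypes."""
    logits = F.normalize(features, dim=1) @ F.normalize(centroids, dim=1).T   # N x K cosine scores
    pseudo = sinkhorn_equipartition(logits)
    log_pred = F.log_softmax(logits / temperature, dim=1)
    return -(pseudo * log_pred).sum(dim=1).mean()
```

In this sketch, detaching the pseudo-labels stops gradients from flowing through the optimal-transport step, so the cross-entropy only updates the point features and the cluster prototypes, which mirrors the self-training loop described in the abstract.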