Paper Title
Unsupervised Deep Metric Learning via Orthogonality based Probabilistic Loss
Paper Authors
Paper Abstract
Metric learning is an important problem in machine learning. It aims to group similar examples together. Existing state-of-the-art metric learning approaches require class labels to learn a metric. Since obtaining class labels is not feasible in all applications, we propose an unsupervised approach that learns a metric without making use of class labels. The lack of class labels is compensated for by obtaining pseudo-labels for the data using a graph-based clustering approach. The pseudo-labels are used to form triplets of examples, which guide the metric learning. We propose a probabilistic loss that minimizes the chance of each triplet violating an angular constraint. A weight function and an orthogonality constraint in the objective speed up convergence and avoid model collapse. We also provide a stochastic formulation of our method to scale up to large-scale datasets. Our studies demonstrate the competitiveness of our approach against state-of-the-art methods. We also thoroughly study the effect of the different components of our method.
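
To make the described pipeline concrete, below is a minimal, hypothetical sketch in PyTorch and scikit-learn. It is not the authors' implementation: spectral clustering stands in for the graph-based clustering step, a standard angular triplet loss (Wang et al., 2017) stands in for the proposed probabilistic loss, and a ||W W^T - I||_F^2 penalty on the embedding weights stands in for the orthogonality constraint; names such as mine_triplets and embed are illustrative only.

# Hypothetical sketch of the pipeline described in the abstract (not the authors' code).
import math
import torch
import torch.nn.functional as F
from sklearn.cluster import SpectralClustering

def mine_triplets(pseudo_labels):
    """Form (anchor, positive, negative) index triplets from cluster pseudo-labels."""
    labels = torch.as_tensor(pseudo_labels)
    triplets = []
    for a in range(len(labels)):
        pos = (labels == labels[a]).nonzero(as_tuple=True)[0]
        pos = pos[pos != a]                      # positives: same pseudo-label, not the anchor
        neg = (labels != labels[a]).nonzero(as_tuple=True)[0]
        if len(pos) == 0 or len(neg) == 0:
            continue
        p = pos[torch.randint(len(pos), (1,))].item()
        n = neg[torch.randint(len(neg), (1,))].item()
        triplets.append((a, p, n))
    return triplets

def angular_triplet_loss(emb, triplets, alpha_deg=40.0):
    """Standard angular loss on unit-normalized embeddings, used here as a
    stand-in for the paper's probabilistic angular-constraint loss."""
    emb = F.normalize(emb, dim=1)
    a, p, n = zip(*triplets)
    xa, xp, xn = emb[list(a)], emb[list(p)], emb[list(n)]
    tan_sq = math.tan(math.radians(alpha_deg)) ** 2
    center = (xa + xp) / 2.0                     # local center of the anchor-positive pair
    # Penalize triplets whose negative lies inside the angular cone around the center.
    loss = F.relu((xa - xp).pow(2).sum(1) - 4.0 * tan_sq * (xn - center).pow(2).sum(1))
    return loss.mean()

def orthogonality_penalty(W):
    """||W W^T - I||_F^2 on the embedding weights, discouraging a collapsed metric."""
    gram = W @ W.t()
    return ((gram - torch.eye(W.size(0), device=W.device)) ** 2).sum()

# Usage sketch: pseudo-labels from (graph-based) spectral clustering, then one loss step.
features = torch.randn(64, 256)                          # stand-in input features
embed = torch.nn.Linear(256, 32, bias=False)             # toy embedding layer
emb = embed(features)
pseudo = SpectralClustering(n_clusters=8, affinity="nearest_neighbors",
                            n_neighbors=10).fit_predict(emb.detach().numpy())
triplets = mine_triplets(pseudo)
loss = angular_triplet_loss(emb, triplets) + 1e-3 * orthogonality_penalty(embed.weight)
loss.backward()                                          # gradients flow into `embed`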