Paper Title
Generating Counterfactual Hard Negative Samples for Graph Contrastive Learning
Paper Authors
Paper Abstract
Graph contrastive learning has emerged as a powerful tool for unsupervised graph representation learning. The key to its success is acquiring high-quality positive and negative samples as contrasting pairs, so as to learn the underlying structural semantics of the input graph. Recent works usually sample negative samples from the same training batch as the positive samples, or from an external, irrelevant graph. However, such strategies suffer from a significant limitation: the unavoidable problem of sampling false negative samples. In this paper, we propose a novel method that utilizes a \textbf{C}ounterfactual mechanism to generate artificial hard negative samples for \textbf{G}raph \textbf{C}ontrastive learning, namely \textbf{CGC}, which takes a different perspective from those sampling-based strategies. The counterfactual mechanism ensures that the generated samples are similar to, but have labels that differ from, the positive sample. The proposed method achieves satisfactory results on several datasets compared to traditional unsupervised graph learning methods and SOTA graph contrastive learning methods. We also conduct supplementary experiments to provide an extensive illustration of the proposed method, including the performance of CGC with different hard negative samples and evaluations of hard negative samples generated with different similarity measurements.
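The abstract does not state CGC's loss formulation, so the following is only a generic illustration of why hard negatives matter in contrastive learning: a minimal InfoNCE-style loss in which an anchor embedding is pulled toward its positive and pushed away from a set of (possibly generated) negative embeddings. The helper names `cosine` and `info_nce_loss` are hypothetical, not from the paper.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def info_nce_loss(anchor, positive, negatives, temperature=0.5):
    """InfoNCE-style contrastive loss for a single anchor embedding.

    The anchor is attracted to the positive and repelled from every
    negative; negatives that are similar to the anchor ("hard" ones)
    contribute larger terms and thus a larger, more informative loss.
    """
    pos = math.exp(cosine(anchor, positive) / temperature)
    neg = sum(math.exp(cosine(anchor, n) / temperature) for n in negatives)
    return -math.log(pos / (pos + neg))
```

As a sanity check on the intuition, a negative nearly parallel to the anchor (a hard negative) yields a higher loss than an anti-parallel (easy) one, which is what gives hard negatives their stronger training signal.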