Paper Title

Swift and Sure: Hardness-aware Contrastive Learning for Low-dimensional Knowledge Graph Embeddings

Paper Authors

Kai Wang, Yu Liu, Quan Z. Sheng

Paper Abstract

Knowledge graph embedding (KGE) has shown great potential in automatic knowledge graph (KG) completion and knowledge-driven tasks. However, recent KGE models suffer from high training costs and large storage requirements, limiting their practicality in real-world applications. To address this challenge, based on the latest findings in the field of Contrastive Learning, we propose a novel KGE training framework called Hardness-aware Low-dimensional Embedding (HaLE). Instead of the traditional Negative Sampling, we design a new loss function based on query sampling that can balance two important training targets, Alignment and Uniformity. Furthermore, we analyze the hardness-aware ability of recent low-dimensional hyperbolic models and propose a lightweight hardness-aware activation mechanism. The experimental results show that, within a limited training time, HaLE can effectively improve the performance and training speed of KGE models on five commonly-used datasets. After just a few minutes of training, the HaLE-trained models are competitive with state-of-the-art models in both low- and high-dimensional conditions.
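
For reference, the Alignment and Uniformity targets mentioned in the abstract are usually defined, following the contrastive-learning literature (Wang & Isola, 2020), over an encoder f as below. This is the standard formulation from that literature, given here only as background; the abstract does not spell out HaLE's exact query-sampling loss, which may differ.

\[
\mathcal{L}_{\mathrm{align}}(f;\alpha) \;=\; \mathbb{E}_{(x,\,x^{+})\sim p_{\mathrm{pos}}}\!\left[\lVert f(x) - f(x^{+})\rVert_2^{\alpha}\right]
\]
\[
\mathcal{L}_{\mathrm{uniform}}(f;t) \;=\; \log \mathbb{E}_{x,\,y \,\overset{\mathrm{i.i.d.}}{\sim}\, p_{\mathrm{data}}}\!\left[e^{-t\,\lVert f(x) - f(y)\rVert_2^{2}}\right]
\]

Alignment rewards keeping embeddings of positive pairs close together, while uniformity rewards spreading all embeddings evenly over the unit hypersphere; the paper describes its query-sampling loss as balancing these two targets without traditional negative sampling.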
