Paper Title
ER: Equivariance Regularizer for Knowledge Graph Completion
Paper Authors
Paper Abstract
Tensor factorization and distance-based models play important roles in knowledge graph completion (KGC). However, the relational matrices in KGC methods often induce high model complexity and carry a high risk of overfitting. As a remedy, researchers have proposed a variety of regularizers, such as the tensor nuclear norm regularizer. Our motivation stems from the observation that previous work focuses only on the "size" of the parametric space, leaving the implicit semantic information largely untouched. To address this issue, we propose a new regularizer, namely the Equivariance Regularizer (ER), which suppresses overfitting by leveraging implicit semantic information. Specifically, ER enhances the generalization ability of the model by exploiting the semantic equivariance between head and tail entities. Moreover, it is a generic solution for both distance-based and tensor-factorization-based models. Experimental results show a clear and substantial improvement over state-of-the-art relation prediction methods.
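The abstract does not give the exact form of ER, but it describes the standard pattern it builds on: a KGC scoring model trained with an additive regularization term. The sketch below is a minimal illustration of that pattern, assuming a DistMult-style tensor-factorization model and the tensor nuclear 3-norm (N3) regularizer the abstract cites as prior work; the class and function names, dimensions, and weights are illustrative assumptions, not the paper's implementation, and ER itself would enter as a further weighted term in the same additive fashion.

```python
# Minimal sketch (PyTorch) of a KGC model trained with an additive regularizer.
# DistMult and the N3 surrogate below are assumptions for illustration only;
# the ER term is defined in the paper and would be added the same way.
import torch
import torch.nn as nn

class DistMult(nn.Module):
    """Tensor-factorization KGC model with a diagonal relation matrix."""
    def __init__(self, n_entities: int, n_relations: int, dim: int = 200):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)

    def score(self, h, r, t):
        # Trilinear score <e_h, w_r, e_t>
        return (self.ent(h) * self.rel(r) * self.ent(t)).sum(dim=-1)

def n3_regularizer(factors):
    # Nuclear 3-norm surrogate: sum of cubed absolute values of the factors.
    return sum(f.abs().pow(3).sum() for f in factors)

# One illustrative training step on a random toy batch.
model = DistMult(n_entities=1000, n_relations=50)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

h = torch.randint(0, 1000, (32,))
r = torch.randint(0, 50, (32,))
t = torch.randint(0, 1000, (32,))
labels = torch.ones(32)  # positive triples only, for brevity

logits = model.score(h, r, t)
loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
loss = loss + 1e-3 * n3_regularizer([model.ent(h), model.rel(r), model.ent(t)])
# ER would be added here as another weighted penalty, e.g.
# loss = loss + lambda_er * er_penalty(...), with er_penalty as defined in the paper.

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because the regularizer enters the objective only as an extra loss term, the same pattern applies unchanged to distance-based models (e.g., a translational score in place of the trilinear one), which is consistent with the abstract's claim that ER is generic across both model families.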