Paper Title

LowFER: Low-rank Bilinear Pooling for Link Prediction

Authors

Saadullah Amin, Stalin Varanasi, Katherine Ann Dunfield, Günter Neumann

Abstract

Knowledge graphs are incomplete by nature, with only a limited number of observed facts from the world knowledge being represented as structured relations between entities. To partly address this issue, an important task in statistical relational learning is that of link prediction or knowledge graph completion. Both linear and non-linear models have been proposed to solve the problem. Bilinear models, while expressive, are prone to overfitting and lead to quadratic growth of parameters in the number of relations. Simpler models have become more standard, with certain constraints imposed on the bilinear map that serves as the relation parameters. In this work, we propose a factorized bilinear pooling model, commonly used in multi-modal learning, for better fusion of entities and relations, leading to an efficient and constraint-free model. We prove that our model is fully expressive, providing bounds on the embedding dimensionality and factorization rank. Our model naturally generalizes the Tucker-decomposition-based TuckER model, which has been shown to generalize other models, as an efficient low-rank approximation without substantially compromising performance. Owing to this low-rank approximation, the model complexity can be controlled by the factorization rank, avoiding the possible cubic growth of TuckER. Empirically, we evaluate on real-world datasets, reaching on-par or state-of-the-art performance. At extreme low ranks, the model preserves performance while staying parameter-efficient.
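To make the scoring function concrete, here is a minimal PyTorch sketch of the factorized bilinear pooling scorer the abstract describes; it is not the authors' released implementation. Subject and relation embeddings are projected by two low-rank factor matrices, fused with an elementwise product, sum-pooled in non-overlapping groups of size k, and matched against all candidate object embeddings. The class name LowFERScorer and the dimensions d_e, d_r, and rank k are illustrative assumptions; dropout, normalization, the output nonlinearity, and the training loss are omitted.

```python
import torch
import torch.nn as nn

class LowFERScorer(nn.Module):
    """Sketch of a low-rank factorized bilinear pooling scorer for link
    prediction. Hyperparameters d_e, d_r, k are illustrative choices."""

    def __init__(self, n_entities, n_relations, d_e=200, d_r=30, k=10):
        super().__init__()
        self.d_e, self.k = d_e, k
        self.E = nn.Embedding(n_entities, d_e)    # entity embeddings
        self.R = nn.Embedding(n_relations, d_r)   # relation embeddings
        # Low-rank factor matrices: U in R^{d_e x (k*d_e)}, V in R^{d_r x (k*d_e)}
        self.U = nn.Parameter(torch.randn(d_e, k * d_e) * 0.01)
        self.V = nn.Parameter(torch.randn(d_r, k * d_e) * 0.01)

    def forward(self, subj_idx, rel_idx):
        e_s = self.E(subj_idx)                    # (B, d_e)
        e_r = self.R(rel_idx)                     # (B, d_r)
        # Factorized bilinear pooling: elementwise product of the projections
        fused = (e_s @ self.U) * (e_r @ self.V)   # (B, k*d_e)
        # Non-overlapping sum pooling over groups of size k -> (B, d_e)
        pooled = fused.view(-1, self.d_e, self.k).sum(dim=-1)
        # Score every entity as a candidate object -> (B, n_entities)
        return pooled @ self.E.weight.t()

# Usage example (entity/relation counts here match FB15k-237):
model = LowFERScorer(n_entities=14541, n_relations=237)
scores = model(torch.tensor([0, 1]), torch.tensor([3, 5]))  # shape (2, 14541)
```

Note that U and V contribute d_e * k * d_e and d_r * k * d_e parameters respectively, so the parameter count grows only linearly in the rank k; this is the knob the abstract refers to for controlling model complexity and avoiding TuckER's cubic core tensor.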
