Title
Evaluating the Impact of Knowledge Graph Context on Entity Disambiguation Models
Authors
Abstract
Pretrained Transformer models have emerged as state-of-the-art approaches that learn contextual information from text to improve the performance of several NLP tasks. These models, albeit powerful, still require specialized knowledge in specific scenarios. In this paper, we argue that context derived from a knowledge graph (in our case: Wikidata) provides enough signal to inform pretrained transformer models and improve their performance on named entity disambiguation (NED) over the Wikidata KG. We further hypothesize that our proposed KG context can be standardized for Wikipedia, and we evaluate its impact on a state-of-the-art NED model for the Wikipedia knowledge base. Our empirical results validate that the proposed KG context generalizes (to Wikipedia), and that supplying KG context to transformer architectures considerably outperforms existing baselines, including vanilla transformer models.
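
To make the idea concrete, here is a minimal sketch, assuming a cross-encoder formulation: KG-derived context for each candidate entity (a Wikidata description plus a few triples, hard-coded below) is paired with the mention's sentence and scored by a pretrained transformer. The model name, input layout, and candidate set are illustrative assumptions, not the paper's exact setup, and the randomly initialized classification head would need fine-tuning before the scores are meaningful.

    # Sketch: score Wikidata candidate entities for a mention by pairing
    # the mention's sentence with KG-derived context in a cross-encoder.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=1  # one relevance score per pair
    )
    model.eval()

    sentence = "Jaguar unveiled its new electric model at the show."
    # Hypothetical KG context per candidate: description + flattened triples.
    candidates = {
        "Q26742": "Jaguar Cars; British automobile manufacturer; "
                  "industry: automotive; headquarters: Coventry",
        "Q35694": "jaguar; big cat species native to the Americas; "
                  "taxon rank: species; habitat: rainforest",
    }

    with torch.no_grad():
        scores = {}
        for qid, kg_context in candidates.items():
            # Sentence and KG context are encoded as a standard sentence pair.
            inputs = tokenizer(sentence, kg_context, return_tensors="pt",
                               truncation=True, max_length=256)
            scores[qid] = model(**inputs).logits.squeeze().item()

    best = max(scores, key=scores.get)
    print(f"Predicted Wikidata entity: {best}")  # illustrative with untuned weights

The design point the abstract argues for is visible here: the transformer never sees the raw graph, only a textual serialization of KG facts, which is what allows the same context to be "standardized" for a Wikipedia-based NED model.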