Paper Title

Cross-Domain Aspect Extraction using Transformers Augmented with Knowledge Graphs

Paper Authors

Phillip Howard, Arden Ma, Vasudev Lal, Ana Paula Simoes, Daniel Korat, Oren Pereg, Moshe Wasserblat, Gadi Singer

Abstract

The extraction of aspect terms is a critical step in fine-grained sentiment analysis of text. Existing approaches for this task have yielded impressive results when the training and testing data are from the same domain. However, these methods show a drastic decrease in performance when applied to cross-domain settings where the domain of the testing data differs from that of the training data. To address this lack of extensibility and robustness, we propose a novel approach for automatically constructing domain-specific knowledge graphs that contain information relevant to the identification of aspect terms. We introduce a methodology for injecting information from these knowledge graphs into Transformer models, including two alternative mechanisms for knowledge insertion: via query enrichment and via manipulation of attention patterns. We demonstrate state-of-the-art performance on benchmark datasets for cross-domain aspect term extraction using our approach and investigate how the amount of external knowledge available to the Transformer impacts model performance.
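The abstract describes injecting knowledge-graph information into a Transformer, one mechanism being query enrichment. A minimal sketch of that idea is shown below: surface terms found in a small domain knowledge graph are turned into textual facts and appended to the input after a separator token, so the model sees the sentence together with relevant external knowledge. The toy knowledge graph, the fact template, and the `[SEP]`-style separator are illustrative assumptions, not the paper's actual pipeline.

```python
# Hedged sketch of knowledge injection via query enrichment.
# The KG contents and fact format are assumptions for illustration only.

# A tiny domain-specific "knowledge graph": surface terms mapped to
# related concepts suggesting the term may be an aspect term.
LAPTOP_KG = {
    "battery": ["component", "power source"],
    "screen": ["component", "display"],
    "keyboard": ["component", "input device"],
}

def enrich_query(tokens, kg, sep="[SEP]"):
    """Append KG-derived facts for any matching token after a separator,
    so a Transformer receives the sentence plus external knowledge."""
    facts = []
    for tok in tokens:
        for concept in kg.get(tok.lower(), []):
            facts.append(f"{tok} is a {concept}")
    if not facts:
        return list(tokens)
    return list(tokens) + [sep] + " ; ".join(facts).split()

sentence = "The battery drains quickly".split()
print(" ".join(enrich_query(sentence, LAPTOP_KG)))
# → The battery drains quickly [SEP] battery is a component ; battery is a power source
```

The enriched token sequence would then be fed to the Transformer in place of the raw sentence; sentences with no KG matches pass through unchanged.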
