Paper Title
Pruned Graph Neural Network for Short Story Ordering
Paper Authors
Paper Abstract
Text coherence is a fundamental problem in natural language generation and understanding. Organizing sentences into an order that maximizes coherence is known as sentence ordering. This paper proposes a new approach based on graph neural networks to encode a set of sentences and learn orderings of short stories. We propose a new method for constructing sentence-entity graphs of short stories that creates edges between sentences and reduces noise in the graph by replacing pronouns with their referring entities. We further improve sentence ordering by introducing an aggregation method based on majority voting over state-of-the-art methods and our proposed one. Our approach employs a BERT-based model to learn semantic representations of the sentences. The results demonstrate that the proposed method significantly outperforms existing baselines on a corpus of short stories, achieving new state-of-the-art performance in terms of Perfect Match Ratio (PMR) and Kendall's Tau (Tau). More precisely, our method improves PMR and Tau by more than 5% and 4.3%, respectively. These outcomes highlight the benefit of forming edges between sentences based on their cosine similarity. We also observe that replacing pronouns with their referring entities effectively encodes sentences in sentence-entity graphs.
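The abstract describes forming graph edges between sentences based on the cosine similarity of their BERT-based representations. Below is a minimal sketch of that idea, not the authors' implementation: the similarity threshold, the use of placeholder embeddings, and the function names are illustrative assumptions.

```python
# Sketch: connect sentence pairs whose embedding cosine similarity exceeds a
# threshold, as one might when building a sentence-entity graph. The threshold
# value and the randomly generated "embeddings" are stand-ins, not the paper's
# actual settings or encoder output.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def build_sentence_edges(embeddings: np.ndarray, threshold: float = 0.5):
    """Return undirected edges (i, j) for sentence pairs whose cosine
    similarity is at least `threshold` (a hypothetical cutoff)."""
    edges = []
    n = len(embeddings)
    for i in range(n):
        for j in range(i + 1, n):
            if cosine_similarity(embeddings[i], embeddings[j]) >= threshold:
                edges.append((i, j))
    return edges


if __name__ == "__main__":
    # Stand-in for sentence embeddings produced by a BERT-based encoder
    # (e.g., 768-dimensional vectors for a 5-sentence short story).
    rng = np.random.default_rng(0)
    fake_embeddings = rng.normal(size=(5, 768))
    print(build_sentence_edges(fake_embeddings, threshold=0.0))
```

In the paper's setting, these edges (together with sentence-entity links obtained after replacing pronouns with their referring entities) would form the graph consumed by the graph neural network; the sketch above only shows the similarity-based edge construction step.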