Paper Title
Improving Continual Relation Extraction through Prototypical Contrastive Learning
Paper Authors
Paper Abstract
Continual relation extraction (CRE) aims to extract relations as new data arrives continually and iteratively, and its major challenge is the catastrophic forgetting of old tasks. To alleviate this critical problem and enhance CRE performance, we propose a novel Continual Relation Extraction framework with Contrastive Learning, namely CRECL, which combines a classification network with a prototypical contrastive network to achieve class-incremental learning for CRE. Specifically, in the contrastive network, a given instance is contrasted with the prototype of each candidate relation stored in the memory module. This contrastive learning scheme makes the data distributions of all tasks more distinguishable, thereby further alleviating catastrophic forgetting. Our experimental results not only demonstrate CRECL's advantage over state-of-the-art baselines on two public datasets, but also verify the effectiveness of CRECL's contrastive learning in improving CRE performance.
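To make the prototypical contrastive step described in the abstract concrete, the sketch below contrasts one instance embedding with the prototype of every candidate relation kept in a memory module. This is a minimal illustration rather than the authors' implementation: the function names, the mean-pooled prototypes, the 768-dimensional embeddings, and the temperature value are all illustrative assumptions.

```python
# Minimal sketch of a prototypical contrastive step (assumed details, not CRECL's exact code).
import torch
import torch.nn.functional as F


def relation_prototypes(memory: dict) -> tuple:
    """Average the memorized instance embeddings of each relation into one prototype."""
    labels = sorted(memory)
    protos = torch.stack([memory[r].mean(dim=0) for r in labels])  # (R, d)
    return F.normalize(protos, dim=-1), labels


def prototypical_contrastive_loss(instance: torch.Tensor,
                                  gold_relation: int,
                                  memory: dict,
                                  temperature: float = 0.1) -> torch.Tensor:
    """Contrast one instance embedding with every candidate relation prototype:
    pull it toward its own relation's prototype, push it away from the others."""
    protos, labels = relation_prototypes(memory)
    z = F.normalize(instance, dim=-1)
    logits = protos @ z / temperature                    # similarity to each prototype, (R,)
    target = torch.tensor(labels.index(gold_relation))   # index of the gold prototype
    return F.cross_entropy(logits.unsqueeze(0), target.unsqueeze(0))


# Toy usage: 3 memorized relations, 5 stored instances each, 768-dim embeddings.
memory = {r: torch.randn(5, 768) for r in range(3)}
loss = prototypical_contrastive_loss(torch.randn(768), gold_relation=1, memory=memory)
print(loss.item())
```

Maximizing the softmax score of the correct prototype (an InfoNCE-style objective) pulls each instance toward its own relation and away from all relations held in memory, which is the mechanism by which the data distributions of old and new tasks are kept distinguishable.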