Paper Title
DSC IIT-ISM at SemEval-2020 Task 6: Boosting BERT with Dependencies for Definition Extraction
Paper Authors
Paper Abstract
We explore the performance of Bidirectional Encoder Representations from Transformers (BERT) on definition extraction. We further propose a joint model of BERT and a Text Level Graph Convolutional Network to incorporate dependency information into the model. Our proposed model outperforms BERT and achieves results comparable to BERT with a fine-tuned language model on DeftEval (Task 6 of SemEval 2020), a shared task of classifying whether or not a sentence contains a definition (Subtask 1).
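The abstract does not spell out the architecture, but the description (BERT representations combined with graph convolutions over dependency edges, followed by sentence classification) suggests roughly the following shape. The sketch below is an illustrative assumption, not the paper's actual code: the class names `GCNLayer` and `BertGCNClassifier`, the choice of `bert-base-uncased`, the number of GCN layers, and the mean-pooling step are all hypothetical, and the dependency adjacency matrix `adj` is assumed to be built externally (e.g., from a spaCy parse) with self-loops added.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class GCNLayer(nn.Module):
    """One graph convolution over token representations.

    `adj` is a (batch, seq_len, seq_len) adjacency matrix built from the
    dependency parse of each sentence, with self-loops added (assumed to
    be constructed outside the model).
    """

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Aggregate each token's dependency neighbours, normalize by degree
        # so well-connected tokens do not dominate, then transform.
        degree = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return torch.relu(self.linear(adj @ x / degree))


class BertGCNClassifier(nn.Module):
    """Hypothetical joint BERT + GCN sentence classifier for Subtask 1."""

    def __init__(self, num_gcn_layers: int = 2, num_labels: int = 2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        dim = self.bert.config.hidden_size
        self.gcn_layers = nn.ModuleList(
            GCNLayer(dim) for _ in range(num_gcn_layers)
        )
        self.classifier = nn.Linear(dim, num_labels)

    def forward(self, input_ids, attention_mask, adj):
        # BERT contextualizes the tokens ...
        hidden = self.bert(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        # ... and the GCN layers propagate information along dependency edges.
        for gcn in self.gcn_layers:
            hidden = gcn(hidden, adj)
        # Mean-pool over non-padding tokens and classify.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1.0)
        return self.classifier(pooled)  # logits: definition vs. no definition
```

In practice such a model also has to align parser tokens with BERT's WordPiece tokens, e.g., by letting each subword inherit the dependency edges of its word; the paper's actual design choices may differ from this sketch.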