Paper Title

Improving Event Duration Prediction via Time-aware Pre-training

Paper Authors

Zonglin Yang, Xinya Du, Alexander Rush, Claire Cardie

Paper Abstract

End-to-end models in NLP rarely encode external world knowledge about length of time. We introduce two effective models for duration prediction, which incorporate external knowledge by reading temporal-related news sentences (time-aware pre-training). Specifically, one model predicts the range/unit where the duration value falls in (R-pred); and the other predicts the exact duration value (E-pred). Our best model, E-pred, substantially outperforms previous work, and captures duration information more accurately than R-pred. We also demonstrate our models are capable of duration prediction in the unsupervised setting, outperforming the baselines.
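
The abstract distinguishes R-pred (predicting the range/unit a duration falls in) from E-pred (predicting the exact duration value). Below is a minimal, hypothetical sketch of how two such prediction heads could sit on a shared sentence encoder; the toy bag-of-words encoder, the unit inventory, and the log-seconds regression target are assumptions for illustration only, not the authors' implementation.

```python
# Illustrative sketch only (not the paper's code): a shared encoder with
# an R-pred-style classification head and an E-pred-style regression head.
import torch
import torch.nn as nn

# Assumed unit inventory for the range/unit classifier.
DURATION_UNITS = ["seconds", "minutes", "hours", "days", "weeks", "months", "years"]

class ToyDurationModel(nn.Module):
    def __init__(self, vocab_size=10_000, hidden=128):
        super().__init__()
        # Stand-in for the pre-trained encoder obtained via time-aware pre-training.
        self.encoder = nn.EmbeddingBag(vocab_size, hidden, mode="mean")
        # R-pred: classify which range/unit the duration falls in.
        self.range_head = nn.Linear(hidden, len(DURATION_UNITS))
        # E-pred: regress the exact duration (log-seconds target is an assumption).
        self.exact_head = nn.Linear(hidden, 1)

    def forward(self, token_ids, offsets):
        h = self.encoder(token_ids, offsets)          # (batch, hidden)
        return self.range_head(h), self.exact_head(h).squeeze(-1)

# Usage on a fake batch of two "sentences" given as flat token ids plus offsets.
model = ToyDurationModel()
token_ids = torch.tensor([1, 5, 42, 7, 9, 3])
offsets = torch.tensor([0, 3])                        # sentence boundaries
range_logits, exact_log_seconds = model(token_ids, offsets)
print(range_logits.shape, exact_log_seconds.shape)    # torch.Size([2, 7]) torch.Size([2])
```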
