Paper Title

Evaluating Robustness to Input Perturbations for Neural Machine Translation

Authors

Xing Niu, Prashant Mathur, Georgiana Dinu, Yaser Al-Onaizan

Abstract

Neural Machine Translation (NMT) models are sensitive to small perturbations in the input. Robustness to such perturbations is typically measured using translation quality metrics such as BLEU on the noisy input. This paper proposes additional metrics which measure the relative degradation and changes in translation when small perturbations are added to the input. We focus on a class of models employing subword regularization to address robustness and perform extensive evaluations of these models using the robustness measures proposed. Results show that our proposed metrics reveal a clear trend of improved robustness to perturbations when subword regularization methods are used.
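
The abstract mentions measuring relative degradation and changes in translation under small input perturbations. Below is a minimal sketch of how such measures could be computed with sacrebleu; the function names, the ratio-based degradation score, and the symmetric-BLEU consistency proxy are illustrative assumptions, not the paper's exact definitions.

```python
# Sketch of two robustness-style measures suggested by the abstract
# (assumes sacrebleu is installed; not the paper's official implementation).
import sacrebleu


def relative_degradation(refs, hyps_clean, hyps_noisy):
    """Ratio of BLEU on perturbed input to BLEU on original input (1.0 = no loss)."""
    bleu_clean = sacrebleu.corpus_bleu(hyps_clean, [refs]).score
    bleu_noisy = sacrebleu.corpus_bleu(hyps_noisy, [refs]).score
    return bleu_noisy / bleu_clean if bleu_clean > 0 else 0.0


def consistency(hyps_clean, hyps_noisy):
    """Symmetric BLEU between translations of original and perturbed inputs,
    used here as a proxy for how much the translation changes."""
    forward = sacrebleu.corpus_bleu(hyps_noisy, [hyps_clean]).score
    backward = sacrebleu.corpus_bleu(hyps_clean, [hyps_noisy]).score
    return 0.5 * (forward + backward)


# Toy usage: refs are gold translations, hyps_clean/hyps_noisy are the model's
# outputs for the original and perturbed source sentences, respectively.
refs = ["the cat sat on the mat"]
hyps_clean = ["the cat sat on the mat"]
hyps_noisy = ["the cat sits on a mat"]
print(relative_degradation(refs, hyps_clean, hyps_noisy))
print(consistency(hyps_clean, hyps_noisy))
```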
