Paper Title

Iterative Batch Back-Translation for Neural Machine Translation: A Conceptual Model

Paper Authors

Idris Abdulmumin, Bashir Shehu Galadanci, Abubakar Isa

Paper Abstract

An effective method for generating a large number of parallel sentences to train improved neural machine translation (NMT) systems is the use of back-translations of target-side monolingual data. Recently, iterative back-translation has been shown to outperform standard back-translation, albeit only on some language pairs. This work proposes iterative batch back-translation, which aims to enhance standard iterative back-translation and enable the efficient utilization of more monolingual data. After each iteration, improved back-translations of new sentences are added to the parallel data that will be used to train the final forward model. The work presents a conceptual model of the proposed approach.
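
The description above maps onto a simple training loop. The following Python sketch is one plausible reading of that loop, not the authors' implementation: the function names (train_nmt, translate, improve_backward_model) are hypothetical placeholders, and the way the backward model is updated between batches is an assumption, since the abstract leaves those details to the full conceptual model.

```python
# Hypothetical stubs standing in for a real NMT toolkit; they only record corpus
# sizes so that the sketch runs end to end.
def train_nmt(pairs):
    """Pretend to train an NMT model on a list of (source, target) pairs."""
    return {"pairs_seen": len(pairs)}

def translate(model, sentence):
    """Pretend to decode a sentence with the given model."""
    return f"<back-translation of: {sentence}>"

def improve_backward_model(model, pool):
    """The abstract does not specify how the backward model improves between
    iterations, so this step is kept abstract; re-training on the reversed
    (target, source) pool is used here only as one possible choice."""
    return train_nmt([(tgt, src) for src, tgt in pool])

def iterative_batch_back_translation(bitext, tgt_mono_batches):
    """bitext: authentic (source, target) pairs.
    tgt_mono_batches: target-side monolingual data split into batches, one per iteration."""
    pool = list(bitext)                                        # grows with synthetic pairs
    backward_model = train_nmt([(t, s) for s, t in bitext])    # target -> source model
    for batch in tgt_mono_batches:
        # Back-translate the next batch of monolingual target sentences with the
        # current backward model; later batches therefore receive improved
        # back-translations, which are added to the parallel data.
        pool += [(translate(backward_model, t), t) for t in batch]
        # Update the backward model before processing the next batch.
        backward_model = improve_backward_model(backward_model, pool)
    # The final forward (source -> target) model is trained on authentic + synthetic data.
    return train_nmt(pool)

if __name__ == "__main__":
    bitext = [("hello", "sannu"), ("thank you", "na gode")]          # toy bitext
    mono_batches = [["barka da safiya", "sai anjima"], ["yaya kake"]]  # two batches
    print(iterative_batch_back_translation(bitext, mono_batches))
```

The key design point the sketch illustrates is that the monolingual data is consumed batch by batch, so sentences back-translated in later iterations benefit from a backward model that has already been improved, while all synthetic pairs accumulate in the pool used to train the final forward model.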
