Paper Title
Context-Aware Cross-Attention for Non-Autoregressive Translation
Paper Authors
Abstract
Non-autoregressive translation (NAT) significantly accelerates the inference process by predicting the entire target sequence in parallel. However, because the decoder lacks target-side dependency modelling, the conditional generation process depends heavily on cross-attention. In this paper, we reveal a localness perception problem in NAT cross-attention, which makes it difficult to adequately capture source context. To alleviate this problem, we propose to incorporate signals from neighbouring source tokens into conventional cross-attention. Experimental results on several representative datasets show that our approach consistently improves translation quality over strong NAT baselines. Extensive analyses demonstrate that the enhanced cross-attention better exploits source context by leveraging both local and global information.
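The abstract does not specify the exact mechanism, so the following is only a minimal illustrative sketch, not the authors' method: one way to incorporate signals from neighbouring source tokens is to add a local-window bias to the cross-attention logits around each target query's best-aligned source position. All function and parameter names below (`local_cross_attention`, `window`, `local_weight`) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def local_cross_attention(Q, K, V, window=2, local_weight=1.0):
    """Cross-attention with a hypothetical locality bias (illustrative only).

    For each target query, source positions within `window` of the query's
    best-matching source position receive an additive bias, so the attention
    distribution also gathers signals from neighbouring source tokens while
    keeping access to the full (global) source sequence.
    """
    d = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d)            # (tgt_len, src_len)
    centre = logits.argmax(axis=-1)          # aligned source position per query
    src_pos = np.arange(K.shape[0])
    # additive bias: local_weight inside the window, 0 outside
    bias = (np.abs(src_pos[None, :] - centre[:, None]) <= window) * local_weight
    attn = softmax(logits + bias, axis=-1)   # rows sum to 1
    return attn @ V, attn
```

Because the bias is additive before the softmax, distant source tokens still receive non-zero attention, which matches the abstract's claim of leveraging both local and global information.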