Paper Title

Lexical Knowledge Internalization for Neural Dialog Generation

Authors

Zhiyong Wu, Wei Bi, Xiang Li, Lingpeng Kong, Ben Kao

Abstract

We propose knowledge internalization (KI), which aims to complement the lexical knowledge into neural dialog models. Instead of further conditioning the knowledge-grounded dialog (KGD) models on externally retrieved knowledge, we seek to integrate knowledge about each input token internally into the model's parameters. To tackle the challenge due to the large scale of lexical knowledge, we adopt the contrastive learning approach and create an effective token-level lexical knowledge retriever that requires only weak supervision mined from Wikipedia. We demonstrate the effectiveness and general applicability of our approach on various datasets and diversified model structures.
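
The abstract describes the retriever only at a high level. Below is a minimal sketch of the kind of token-level contrastive objective it alludes to, not the paper's actual implementation: it assumes each input token has been weakly paired with the embedding of a knowledge sentence mined from Wikipedia, and uses the other pairs in the batch as negatives. The function name, the symmetric InfoNCE formulation, and the temperature value are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def token_knowledge_contrastive_loss(token_emb, knowledge_emb, temperature=0.07):
    """InfoNCE-style loss aligning each token embedding with its weakly
    paired knowledge embedding; other pairs in the batch act as negatives.

    token_emb:     (batch, dim) contextual embeddings of input tokens
    knowledge_emb: (batch, dim) embeddings of the knowledge text weakly
                   aligned to each token (e.g., a Wikipedia sentence)
    """
    token_emb = F.normalize(token_emb, dim=-1)
    knowledge_emb = F.normalize(knowledge_emb, dim=-1)
    # Cosine-similarity logits between every token and every knowledge text.
    logits = token_emb @ knowledge_emb.t() / temperature  # (batch, batch)
    targets = torch.arange(logits.size(0), device=logits.device)
    # Symmetric loss: token-to-knowledge and knowledge-to-token retrieval.
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.t(), targets)) / 2

# Toy usage with random embeddings (a batch of 8 token/knowledge pairs).
tokens = torch.randn(8, 128)
knowledge = torch.randn(8, 128)
loss = token_knowledge_contrastive_loss(tokens, knowledge)
```

In the setup the abstract describes, a loss of this flavor would be trained alongside the dialog generation objective, so that lexical knowledge ends up internalized in the model's parameters instead of being retrieved and conditioned on at inference time.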
