Paper Title

Learning to Diversify for Product Question Generation

Paper Authors

Haggai Roitman, Uriel Singer, Yotam Eshel, Alexander Nus, Eliyahu Kiperwasser

Paper Abstract

We address the product question generation task. For a given product description, our goal is to generate questions that reflect potential user information needs that are either missing from or not well covered in the description. Moreover, we wish to cover diverse user information needs that may span a multitude of product types. To this end, we first show how the T5 pre-trained Transformer encoder-decoder model can be fine-tuned for the task. Yet, while the T5-generated questions have reasonable quality compared to the state-of-the-art method for the task (KPCNet), many such questions are still too general, resulting in sub-optimal global question diversity. As an alternative, we propose a novel learning-to-diversify (LTD) fine-tuning approach that enriches the language learned by the underlying Transformer model. Our empirical evaluation shows that using our approach significantly improves the global diversity of the underlying Transformer model, while preserving, as much as possible, its generation relevance.
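To make the abstract's first step concrete, below is a minimal sketch of fine-tuning T5 for description-to-question generation with the Hugging Face transformers library. The "generate question:" prompt prefix, the example data, and all hyperparameters are illustrative assumptions, not the authors' released setup; the LTD objective itself is described in the paper and not reproduced here.

```python
# A minimal sketch (not the authors' released code): one fine-tuning step of
# T5 on a (product description, question) pair, plus sampling-based decoding.
# Model size, prompt prefix, example data, and decoding parameters are
# illustrative assumptions.
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

# Hypothetical training pair: product description in, user question out.
description = "Stainless steel electric kettle, 1.7 L, 1500 W, cordless base."
question = "Does this kettle shut off automatically when the water boils?"

inputs = tokenizer("generate question: " + description,
                   return_tensors="pt", truncation=True, max_length=512)
labels = tokenizer(question, return_tensors="pt").input_ids

# Standard cross-entropy fine-tuning step (the paper's LTD fine-tuning would
# modify this objective; those details are in the paper, not sketched here).
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()

# Nucleus sampling to draw several candidate questions per description.
model.eval()
with torch.no_grad():
    outputs = model.generate(**inputs, do_sample=True, top_p=0.95,
                             num_return_sequences=5, max_length=32)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```

The abstract does not define how global diversity is measured; a common proxy in generation work is the distinct-n ratio over the full pool of generated questions, sketched below under that assumption.

```python
def distinct_n(texts, n=2):
    """Fraction of unique n-grams across all texts (a common diversity proxy;
    whether the paper uses this exact metric is an assumption here)."""
    tokens = [t.split() for t in texts]
    ngrams = [tuple(ts[i:i + n]) for ts in tokens for i in range(len(ts) - n + 1)]
    return len(set(ngrams)) / max(len(ngrams), 1)
```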
