Paper Title

Compositional Networks Enable Systematic Generalization for Grounded Language Understanding

Paper Authors

Yen-Ling Kuo, Boris Katz, Andrei Barbu

Paper Abstract

Humans are remarkably flexible when understanding new sentences that include combinations of concepts they have never encountered before. Recent work has shown that while deep networks can mimic some human language abilities when presented with novel sentences, systematic variation uncovers the limitations in the language-understanding abilities of networks. We demonstrate that these limitations can be overcome by addressing the generalization challenges in the gSCAN dataset, which explicitly measures how well an agent is able to interpret novel linguistic commands grounded in vision, e.g., novel pairings of adjectives and nouns. The key principle we employ is compositionality: that the compositional structure of networks should reflect the compositional structure of the problem domain they address, while allowing other parameters to be learned end-to-end. We build a general-purpose mechanism that enables agents to generalize their language understanding to compositional domains. Crucially, our network has the same state-of-the-art performance as prior work while generalizing its knowledge when prior work does not. Our network also provides a level of interpretability that enables users to inspect what each part of the network learns. Robust grounded language understanding without dramatic failures and without corner cases is critical to building safe and fair robots; we demonstrate the significant role that compositionality can play in achieving that goal.
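
To make the compositionality principle concrete, below is a minimal, hypothetical sketch: one small learned module per word, composed in an order that mirrors the structure of the command. This is not the authors' implementation; the module architecture, feature size, and left-to-right composition over a pre-ordered command are all illustrative assumptions.

```python
# Hypothetical sketch of compositional per-word modules (not the paper's code).
# Each word owns a small network; a command composes those networks so that
# novel word pairings reuse modules already trained in other contexts.
import torch
import torch.nn as nn

FEAT = 32  # assumed feature size for grounded grid-cell embeddings


class WordModule(nn.Module):
    """One learned module per word; transforms a grounded feature map."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEAT, FEAT), nn.ReLU(), nn.Linear(FEAT, FEAT)
        )

    def forward(self, x):
        return self.net(x)


class CompositionalModel(nn.Module):
    """Composes per-word modules following the command's structure."""

    def __init__(self, vocab):
        super().__init__()
        self.modules_by_word = nn.ModuleDict({w: WordModule() for w in vocab})

    def forward(self, command, grid_features):
        # Apply modules innermost-first: ["square", "red", "walk"]
        # computes walk(red(square(features))).
        x = grid_features
        for word in command:
            x = self.modules_by_word[word](x)
        return x


model = CompositionalModel(["walk", "red", "square", "blue", "circle"])
features = torch.randn(6 * 6, FEAT)  # a flattened 6x6 gSCAN-like grid
out = model(["square", "red", "walk"], features)
print(out.shape)  # torch.Size([36, 32])
```

In this sketch, "red" is a single module reused wherever the word appears, so a pairing never seen during training, such as "red square", still routes through parameters trained in other contexts; this is the sense in which making the network's structure mirror the domain's compositional structure supports systematic generalization, while each module's weights remain learnable end-to-end.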
