Paper Title
Worst-Case-Aware Curriculum Learning for Zero and Few Shot Transfer
Paper Authors
Paper Abstract
Multi-task transfer learning based on pre-trained language encoders achieves state-of-the-art performance across a range of tasks. Standard approaches implicitly assume the tasks, for which we have training data, are equally representative of the tasks we are interested in, an assumption which is often hard to justify. This paper presents a more agnostic approach to multi-task transfer learning, which uses automated curriculum learning to minimize a new family of worst-case-aware losses across tasks. Not only do these losses lead to better performance on outlier tasks; they also lead to better performance in zero-shot and few-shot transfer settings.
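The abstract's core idea, replacing a uniform average of per-task losses with an objective that emphasizes the worst-performing task, can be illustrated with a minimal sketch. The function below and its `alpha` parameter are hypothetical illustrations, not the paper's actual formulation; the paper introduces a whole family of worst-case-aware losses whose details are not given here.

```python
def worst_case_aware_loss(task_losses, alpha=0.5):
    """Toy interpolation between standard multi-task training and worst-case training.

    alpha = 0 recovers the standard objective (mean over task losses);
    alpha = 1 recovers a pure worst-case objective (max over task losses).
    This is a sketch for intuition only, not the paper's proposed loss family.
    """
    worst = max(task_losses)                      # loss on the worst task
    mean = sum(task_losses) / len(task_losses)    # standard multi-task average
    return alpha * worst + (1 - alpha) * mean

# Example: per-task losses of 1.0 and 3.0 with alpha=0.5
# blend the max (3.0) and the mean (2.0) into 2.5.
print(worst_case_aware_loss([1.0, 3.0], alpha=0.5))
```

Intuitively, weighting the worst task more heavily keeps training pressure on outlier tasks, which is how the abstract motivates improved performance there and in zero-shot and few-shot transfer.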