Paper Title
Online Parameter-Free Learning of Multiple Low Variance Tasks
Paper Authors
Paper Abstract
We propose a method to learn a common bias vector for a growing sequence of low-variance tasks. Unlike state-of-the-art approaches, our method does not require tuning any hyper-parameter. Our approach is presented in the non-statistical setting and comes in two variants: the "aggressive" one updates the bias after each datapoint, while the "lazy" one updates the bias only at the end of each task. We derive an across-tasks regret bound for the method. Compared to state-of-the-art approaches, the aggressive variant achieves faster rates, while the lazy one recovers standard rates but without the need to tune hyper-parameters. We then adapt the methods to the statistical setting: the aggressive variant becomes a multi-task learning method, the lazy one a meta-learning method. Experiments confirm the effectiveness of our methods in practice.
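To make the aggressive/lazy distinction concrete, here is a minimal toy sketch, not the paper's parameter-free algorithm: a shared bias warm-starts plain online gradient descent on each task, and the bias itself is moved either after every datapoint ("aggressive") or once at the end of each task ("lazy"). All function names, step sizes `eta` and `gamma`, and the squared-loss setup are illustrative assumptions.

```python
# Toy sketch (NOT the paper's method): learning a shared bias across a
# sequence of tasks. "aggressive" updates the bias after each datapoint,
# "lazy" only at the end of each task. Step sizes eta/gamma are fixed
# here for illustration; the paper's point is avoiding such tuning.

def axpy(a, x, y):
    """Elementwise a*x + y for small plain-list vectors."""
    return [a * xi + yi for xi, yi in zip(x, y)]

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def run(tasks, mode, eta=0.1, gamma=0.05):
    """tasks: list of tasks, each a list of (x, y) pairs.
    Squared loss on the linear prediction <w, x>.
    Returns (cumulative loss, final bias)."""
    bias = [0.0, 0.0]
    total_loss = 0.0
    for task in tasks:
        w = list(bias)                       # warm-start from shared bias
        for x, y in task:
            err = dot(w, x) - y
            total_loss += 0.5 * err * err
            grad = [err * xi for xi in x]
            w = axpy(-eta, grad, w)          # within-task gradient step
            if mode == "aggressive":         # bias moves after each point
                bias = axpy(-gamma, grad, bias)
        if mode == "lazy":                   # bias moves once per task,
            delta = [wi - bi for wi, bi in zip(w, bias)]
            bias = axpy(gamma, delta, bias)  # toward the task's solution
    return total_loss, bias
```

On a sequence of identical (zero-variance) tasks, both variants accumulate less loss than a baseline whose bias is frozen at zero, since later tasks start closer to their solution; this is the benefit the low-variance assumption is meant to capture.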