Paper Title
Energy-Based Models for Continual Learning
Paper Authors
Paper Abstract
We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems. Instead of tackling continual learning via the use of external memory, growing models, or regularization, EBMs change the underlying training objective to cause less interference with previously learned information. Our proposed version of EBMs for continual learning is simple, efficient, and outperforms baseline methods by a large margin on several benchmarks. Moreover, our proposed contrastive divergence-based training objective can be combined with other continual learning methods, resulting in substantial boosts in their performance. We further show that EBMs are adaptable to a more general continual learning setting where the data distribution changes without the notion of explicitly delineated tasks. These observations point towards EBMs as a useful building block for future continual learning methods.