Paper Title


BinPlay: A Binary Latent Autoencoder for Generative Replay Continual Learning

Authors

Kamil Deja, Paweł Wawrzyński, Daniel Marczak, Wojciech Masarczyk, Tomasz Trzciński

Abstract


We introduce a binary latent space autoencoder architecture to rehearse training samples for the continual learning of neural networks. The ability to extend the knowledge of a model with new data without forgetting previously learned samples is a fundamental requirement in continual learning. Existing solutions address it by either replaying past data from memory, which is unsustainable with growing training data, or by reconstructing past samples with generative models that are trained to generalize beyond training data and, hence, miss important details of individual samples. In this paper, we take the best of both worlds and introduce a novel generative rehearsal approach called BinPlay. Its main objective is to find a quality-preserving encoding of past samples into precomputed binary codes living in the autoencoder's binary latent space. Since we parametrize the formula for precomputing the codes only on the chronological indices of the training samples, the autoencoder is able to compute the binary embeddings of rehearsed samples on the fly without the need to keep them in memory. Evaluation on three benchmark datasets shows up to a twofold accuracy improvement of BinPlay versus competing generative replay methods.
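The abstract's key mechanism is that each rehearsal sample's binary latent code is a deterministic function of the sample's chronological index, so the codes never need to be stored. The sketch below illustrates that idea only; the latent dimension, the seeded-hash code construction, and the stand-in linear decoder are all illustrative assumptions, not the authors' actual BinPlay implementation.

```python
# Illustrative sketch (assumed, not the paper's code): derive a binary latent
# code in {-1, +1}^d deterministically from a sample's chronological index,
# so rehearsal embeddings can be regenerated on the fly instead of memorized.
import numpy as np

LATENT_DIM = 64  # assumed size of the binary latent space

def binary_code(index: int, dim: int = LATENT_DIM) -> np.ndarray:
    """Map a chronological sample index to a fixed binary code in {-1, +1}^dim."""
    rng = np.random.default_rng(seed=index)  # the index fully determines the code
    return np.where(rng.random(dim) < 0.5, -1.0, 1.0)

# The same index always yields the same code, so nothing must be kept in memory:
c1 = binary_code(42)
c2 = binary_code(42)
assert np.array_equal(c1, c2)

# A decoder (here a random linear map standing in for the trained network)
# can then reconstruct a rehearsal sample from the regenerated code.
decoder = np.random.default_rng(0).standard_normal((LATENT_DIM, 28 * 28))
rehearsed = binary_code(42) @ decoder  # stand-in "reconstruction" of a past sample
```

In the paper's setting the decoder would be the trained autoencoder's decoder network, and training would pull each sample's encoding toward its precomputed binary code so that reconstruction from the regenerated code preserves sample-level detail.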
