Title

Data Dieting in GAN Training

Authors

Toutouh, Jamal, O'Reilly, Una-May, Hemberg, Erik

Abstract

We investigate training Generative Adversarial Networks (GANs) with less data. Subsets of the training dataset can express empirical sample diversity while reducing training resource requirements, e.g., time and memory. We ask how much data reduction impacts generator performance and gauge the additive value of generator ensembles. In addition to considering stand-alone GAN training and ensembles of generator models, we also consider reduced-data training with an evolutionary GAN training framework named Redux-Lipizzaner. Redux-Lipizzaner makes GAN training more robust and accurate by exploiting overlapping neighborhood-based training on a spatial 2D grid. We conduct empirical experiments on Redux-Lipizzaner using the MNIST and CelebA datasets.
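The two mechanisms the abstract names, training on a subset of the data and overlapping neighborhoods on a spatial 2D grid, can be illustrated with a minimal sketch. This is not the paper's implementation: the function names are hypothetical, the uniform random subsample is an assumption (the abstract only says subsets of the training set are used), and the von Neumann neighborhood on a toroidal grid is an assumed topology commonly used in Lipizzaner-style spatial training.

```python
import random

def diet_subset(samples, fraction, seed=0):
    # Uniform random subsample of the training data (a "data diet").
    # Assumption: uniform sampling; the paper only states that dataset
    # subsets are used to reduce time and memory requirements.
    rng = random.Random(seed)
    k = max(1, int(len(samples) * fraction))
    return rng.sample(list(samples), k)

def neighborhood(row, col, rows, cols):
    # Von Neumann neighborhood (a cell plus its four adjacent cells) on a
    # toroidal grid, i.e. indices wrap at the edges. Assumed topology for
    # the spatial 2D grid mentioned in the abstract.
    return {
        (row, col),
        ((row - 1) % rows, col),
        ((row + 1) % rows, col),
        (row, (col - 1) % cols),
        (row, (col + 1) % cols),
    }

# A 25% "diet" of a 60,000-sample (MNIST-sized) index set:
subset = diet_subset(range(60000), 0.25)
print(len(subset))  # 15000

# Adjacent cells share members, so the neighborhoods overlap and training
# signals can propagate across the grid:
overlap = neighborhood(1, 1, 3, 3) & neighborhood(1, 2, 3, 3)
print(len(overlap) > 0)  # True
```

Each grid cell would train its GAN against the generators and discriminators in its own neighborhood, and the overlap between adjacent neighborhoods is what couples the cells together.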
