Paper Title

Compressive Learning of Generative Networks

Authors

Vincent Schellekens, Laurent Jacques

Abstract


Generative networks implicitly approximate complex densities from their sampling with impressive accuracy. However, because of the enormous scale of modern datasets, this training process is often computationally expensive. We cast generative network training into the recent framework of compressive learning: we reduce the computational burden of large-scale datasets by first harshly compressing them in a single pass as a single sketch vector. We then propose a cost function, which approximates the Maximum Mean Discrepancy metric, but requires only this sketch, which makes it time- and memory-efficient to optimize.
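The sketching idea in the abstract can be illustrated with a minimal example. This is not the authors' implementation: the choice of a random-Fourier-feature map, the Gaussian frequency distribution `Omega`, and the sketch size `m` are all assumptions made here for illustration. The key property shown is that the dataset is compressed in a single pass into one vector, and the cost function afterwards only ever touches that vector, never the full data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a large dataset X (n samples, d features) and
# m random frequencies Omega drawn from a Gaussian (an assumption here).
n, d, m = 10000, 2, 64
X = rng.normal(size=(n, d))
Omega = rng.normal(size=(d, m))

# Single-pass sketch: empirical average of random Fourier features.
# After this line, the n x d dataset can be discarded; only the
# m-dimensional complex vector z_data is needed for training.
z_data = np.exp(1j * (X @ Omega)).mean(axis=0)  # shape (m,)

def sketch_cost(samples):
    """Squared distance between the sketch of generated samples and the
    data sketch -- a proxy for the Maximum Mean Discrepancy that requires
    only z_data, making it cheap in time and memory."""
    z_gen = np.exp(1j * (samples @ Omega)).mean(axis=0)
    return float(np.linalg.norm(z_gen - z_data) ** 2)

# Samples from a well-matched generator score lower than a mismatched one.
good = rng.normal(size=(1000, d))          # same distribution as the data
bad = rng.normal(loc=0.5, size=(1000, d))  # mean-shifted generator
print(sketch_cost(good), sketch_cost(bad))
```

In an actual training loop, `sketch_cost` would be evaluated on samples drawn from the generative network and minimized with respect to its parameters; the point of the paper's framework is that this loop never revisits the original large-scale dataset.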
