Paper Title
Generating natural images with direct Patch Distributions Matching
Paper Authors
Paper Abstract
Many traditional computer vision algorithms generate realistic images by requiring that each patch in the generated image be similar to a patch in a training image and vice versa. Recently, this classical approach has been replaced by adversarial training with a patch discriminator. The adversarial approach avoids the computational burden of finding nearest-neighbor patches, but it often requires very long training times and may fail to match the distribution of patches. In this paper we leverage the recently developed Sliced Wasserstein Distance and develop an algorithm that explicitly and efficiently minimizes the distance between the patch distributions of two images. Our method is conceptually simple, requires no training, and can be implemented in a few lines of code. On a number of image generation tasks we show that our results are often superior to single-image GANs, require no training, and can generate high-quality images in a few seconds. Our implementation is available at https://github.com/ariel415el/GPDM
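
To make the idea concrete, below is a minimal sketch of direct patch-distribution matching with the Sliced Wasserstein Distance in PyTorch. It is not the authors' released implementation (see the repository linked above): the patch size, number of projections, optimizer settings, and the single-scale optimization loop are illustrative assumptions, and details such as a coarse-to-fine schedule are omitted.

```python
import torch
import torch.nn.functional as F

def extract_patches(img, patch_size=7, stride=1):
    # img: (1, C, H, W) -> (num_patches, C * patch_size * patch_size)
    patches = F.unfold(img, kernel_size=patch_size, stride=stride)
    return patches.squeeze(0).t()

def sliced_wasserstein(x, y, num_projections=64):
    # x, y: (n, d) patch sets with equal n (assumed: same image size and stride).
    # Project both sets onto random unit directions, sort the 1D projections,
    # and average the squared differences of the sorted values.
    d = x.shape[1]
    proj = torch.randn(d, num_projections, device=x.device)
    proj = proj / proj.norm(dim=0, keepdim=True)
    x_sorted, _ = torch.sort(x @ proj, dim=0)
    y_sorted, _ = torch.sort(y @ proj, dim=0)
    return ((x_sorted - y_sorted) ** 2).mean()

def match_patch_distributions(target_img, num_steps=300, lr=0.01, patch_size=7):
    # target_img: (1, C, H, W) in [0, 1]. Optimize a synthesized image so that
    # its patch distribution matches the target's under the sliced Wasserstein
    # distance, using plain gradient descent on the pixels.
    synth = torch.rand_like(target_img, requires_grad=True)
    opt = torch.optim.Adam([synth], lr=lr)
    target_patches = extract_patches(target_img, patch_size).detach()
    for _ in range(num_steps):
        opt.zero_grad()
        loss = sliced_wasserstein(extract_patches(synth, patch_size), target_patches)
        loss.backward()
        opt.step()
    return synth.detach().clamp(0, 1)
```

Because the 1D projections are re-sampled at every step, each iteration minimizes a fresh stochastic estimate of the distance, which is what keeps the per-step cost low compared to nearest-neighbor patch matching.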