Paper Title
Prominent Attribute Modification using Attribute Dependent Generative Adversarial Network
Paper Authors
Paper Abstract
Modifying facial images with desired attributes is an important yet challenging task in computer vision, where the goal is to change a single attribute or multiple attributes of a face image. Existing methods are either attribute-independent approaches, where the modification is performed on the latent representation, or attribute-dependent approaches. Attribute-independent methods are limited in performance because they require paired data for changing the desired attributes; moreover, the attribute-independent constraint may cause a loss of information and, hence, fail to generate the required attributes in the face image. In contrast, attribute-dependent approaches are effective, as they can modify the required features while preserving the information in the given image. However, attribute-dependent approaches are sensitive and require careful model design to generate high-quality results. To address this problem, we propose an attribute-dependent face modification approach. The proposed approach is based on two generators and two discriminators that utilize both the binary and the real-valued representation of the attributes and, in return, generate high-quality attribute modification results. Experiments on the CelebA dataset show that our method effectively performs multiple-attribute editing while keeping other facial details intact.
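The abstract describes the architecture only at a high level (two generators and two discriminators conditioned on binary and real-valued attribute representations). The following is a minimal PyTorch sketch, not the authors' implementation: the class names, layer sizes, the 64x64 input resolution, and the assumption that one generator/discriminator pair consumes binary attribute labels while the other consumes real-valued attribute intensities are all illustrative guesses, since the abstract does not specify these details.

```python
# Minimal sketch of an attribute-dependent editing GAN with two generators and
# two discriminators. All layer sizes, resolutions, and the way the attribute
# vectors are injected are assumptions made purely for illustration.
import torch
import torch.nn as nn


class Generator(nn.Module):
    """Encoder-decoder that edits a face image conditioned on an attribute vector."""
    def __init__(self, attr_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # The attribute vector is broadcast over spatial dims and concatenated
        # to the encoded feature map before decoding.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128 + attr_dim, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x, attrs):
        h = self.encoder(x)
        a = attrs[:, :, None, None].expand(-1, -1, h.size(2), h.size(3))
        return self.decoder(torch.cat([h, a], dim=1))


class Discriminator(nn.Module):
    """Predicts a real/fake score and the attribute vector of the input image."""
    def __init__(self, attr_dim: int, img_size: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Flatten(),
        )
        feat_dim = 128 * (img_size // 4) ** 2
        self.adv = nn.Linear(feat_dim, 1)         # real/fake score
        self.cls = nn.Linear(feat_dim, attr_dim)  # attribute prediction

    def forward(self, x):
        f = self.features(x)
        return self.adv(f), self.cls(f)


# Two generator/discriminator pairs: one conditioned on binary attribute labels,
# the other on real-valued attribute representations -- one plausible reading of
# "the binary as well as the real representation of the attributes".
attr_dim = 13  # number of edited CelebA attributes (illustrative)
G_bin, G_real = Generator(attr_dim), Generator(attr_dim)
D_bin, D_real = Discriminator(attr_dim), Discriminator(attr_dim)

# Usage example with random data:
x = torch.randn(4, 3, 64, 64)                    # batch of normalized face images
a_bin = torch.randint(0, 2, (4, attr_dim)).float()  # binary target attributes
edited = G_bin(x, a_bin)                         # edited images, shape (4, 3, 64, 64)
score, pred_attrs = D_bin(edited)                # adversarial score and attribute estimate
```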