Paper Title
Filter Style Transfer between Photos
Paper Authors
Abstract
Over the past few years, image-to-image style transfer has risen to the frontiers of neural image processing. While conventional methods have been successful in various tasks such as color and texture transfer between images, none can effectively handle the custom filter effects that users apply through platforms like Instagram. In this paper, we introduce a new concept of style transfer, Filter Style Transfer (FST). Unlike conventional style transfer, the new technique, FST, can extract a custom filter style from a filtered style image and transfer it to a content image. FST first infers the original image from the filtered reference via image-to-image translation; it then estimates the filter parameters from the difference between the two. To resolve the ill-posed nature of reconstructing the original image from the reference, we represent each pixel color of an image as a class mean and deviation. Furthermore, to handle intra-class color variation, we propose an uncertainty-based weighted least squares method for restoring the original image. To the best of our knowledge, FST is the first style transfer method that can transfer custom filter effects between FHD images in under 2 ms on a mobile device without any textual context loss.
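To illustrate the filter-estimation step described above, the following is a minimal sketch, assuming a simple per-channel affine filter model y ≈ a·x + b and per-pixel uncertainty values; the function name, the affine model, and the inverse-variance weighting scheme are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def estimate_affine_filter(x, y, sigma):
    """Uncertainty-weighted least squares fit of y ≈ a*x + b.

    x, y  : (N,) original / filtered pixel intensities for one channel
    sigma : (N,) per-pixel uncertainty (e.g., intra-class color deviation)
    Returns the estimated filter parameters (a, b).
    """
    w = 1.0 / (sigma ** 2 + 1e-8)              # inverse-variance weights
    A = np.stack([x, np.ones_like(x)], axis=1)  # design matrix [x, 1]
    AtW = A.T * w                               # apply weights row-wise
    # Solve the normal equations (A^T W A) p = A^T W y for p = [a, b]
    a, b = np.linalg.solve(AtW @ A, AtW @ y)
    return a, b
```

Pixels with high uncertainty (large deviation within their color class) contribute less to the fit, which is the intuition behind weighting by inverse variance.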