Paper Title

Joint Bilateral Learning for Real-time Universal Photorealistic Style Transfer

Authors

Xide Xia, Meng Zhang, Tianfan Xue, Zheng Sun, Hui Fang, Brian Kulis, Jiawen Chen

Abstract

Photorealistic style transfer is the task of transferring the artistic style of an image onto a content target, producing a result that is plausibly taken with a camera. Recent approaches, based on deep neural networks, produce impressive results but are either too slow to run at practical resolutions, or still contain objectionable artifacts. We propose a new end-to-end model for photorealistic style transfer that is both fast and inherently generates photorealistic results. The core of our approach is a feed-forward neural network that learns local edge-aware affine transforms that automatically obey the photorealism constraint. When trained on a diverse set of images and a variety of styles, our model can robustly apply style transfer to an arbitrary pair of input images. Compared to the state of the art, our method produces visually superior results and is three orders of magnitude faster, enabling real-time performance at 4K on a mobile phone. We validate our method with ablation and user studies.
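The abstract's key idea, local edge-aware affine transforms, follows the bilateral-grid approach: a network predicts a low-resolution 3D grid of per-region affine color transforms, which are then "sliced" out at full resolution using a guidance map, so every output pixel is an affine function of its input color. Because the transform is locally affine, the output inherits the edges of the input, which is what enforces the photorealism constraint. Below is a minimal, hedged sketch of that slicing-and-applying step in NumPy; the function name, grid shapes, and nearest-neighbor lookup are illustrative assumptions, not the paper's exact model (which learns the grid with a neural network and uses trilinear interpolation).

```python
import numpy as np

def slice_and_apply(grid, image, guide):
    """Apply per-pixel affine color transforms stored in a low-res
    bilateral grid (illustrative sketch; shapes are assumptions).

    grid:  (Gh, Gw, Gd, 3, 4) affine coefficients per grid cell
    image: (H, W, 3) input image in [0, 1]
    guide: (H, W) scalar guidance map in [0, 1], e.g. luma
    """
    Gh, Gw, Gd = grid.shape[:3]
    H, W = guide.shape
    # Continuous grid coordinates for every full-resolution pixel:
    # spatial position selects (y, x), guidance intensity selects z.
    ys = np.linspace(0, Gh - 1, H)
    xs = np.linspace(0, Gw - 1, W)
    gy, gx = np.meshgrid(ys, xs, indexing="ij")
    gz = guide * (Gd - 1)
    # Nearest-neighbor lookup keeps the sketch short; the real method
    # interpolates trilinearly, which is what makes it edge-aware.
    iy = np.clip(np.round(gy).astype(int), 0, Gh - 1)
    ix = np.clip(np.round(gx).astype(int), 0, Gw - 1)
    iz = np.clip(np.round(gz).astype(int), 0, Gd - 1)
    A = grid[iy, ix, iz]                      # (H, W, 3, 4) per-pixel affine
    # Apply y = A @ [r, g, b, 1] at each pixel.
    homo = np.concatenate([image, np.ones((H, W, 1))], axis=-1)
    return np.einsum("hwij,hwj->hwi", A, homo)

# Sanity check: a grid of identity transforms leaves the image unchanged.
grid = np.zeros((16, 16, 8, 3, 4))
grid[..., :3, :3] = np.eye(3)
img = np.random.rand(64, 64, 3)
out = slice_and_apply(grid, img, img.mean(axis=-1))
```

The cheap part is that the network only ever runs at grid resolution; slicing at full resolution is a per-pixel lookup plus a 3x4 matrix multiply, which is why this family of methods reaches real-time 4K performance.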
