Paper Title

Fast and Robust Face-to-Parameter Translation for Game Character Auto-Creation

Paper Authors

Tianyang Shi, Zhengxia Zou, Yi Yuan, Changjie Fan

Paper Abstract

With the rapid development of Role-Playing Games (RPGs), players are now allowed to edit the facial appearance of their in-game characters according to their preferences rather than using default templates. This paper proposes a game character auto-creation framework that generates in-game characters from a player's input face photo. Unlike previous methods designed on top of neural style transfer or monocular 3D face reconstruction, we re-formulate the character auto-creation process from a different point of view: predicting a large set of physically meaningful facial parameters under a self-supervised learning paradigm. Instead of iteratively updating the facial parameters at the input end of the renderer, as suggested by previous methods, which is time-consuming, we introduce a facial parameter translator so that creation can be done efficiently through a single forward propagation from face embeddings to parameters, with a considerable 1000x computational speedup. Despite its high efficiency, interactivity is preserved in our method: users are allowed to optionally fine-tune the facial parameters of the created character according to their needs. Our approach also shows better robustness than previous methods, especially for photos with large head-pose variations. Comparison results and ablation analysis on seven public face verification datasets demonstrate the effectiveness of our method.
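The key structural idea in the abstract, replacing iterative optimization of the renderer's input parameters with a single forward pass through a learned "facial parameter translator", can be illustrated with a short sketch. The following is a minimal PyTorch sketch of that idea only, not the paper's actual architecture: the layer sizes, the 512-dim face embedding, the 264-parameter output, and all names are illustrative assumptions, since the abstract does not specify these details.

```python
# Minimal sketch (assumptions throughout): a translator network that maps
# a fixed-length face embedding to facial parameters in ONE forward pass,
# rather than iteratively optimizing the parameters through the renderer.
import torch
import torch.nn as nn


class FaceParameterTranslator(nn.Module):
    # embed_dim and num_params are placeholder values, not the paper's.
    def __init__(self, embed_dim: int = 512, num_params: int = 264):
        super().__init__()
        # Shared MLP trunk over the face embedding (which would come from
        # a pre-trained face recognition network in a full pipeline).
        self.trunk = nn.Sequential(
            nn.Linear(embed_dim, 512),
            nn.ReLU(inplace=True),
            nn.Linear(512, 512),
            nn.ReLU(inplace=True),
        )
        # Output head; a sigmoid keeps each predicted parameter in the
        # [0, 1] range typical of in-game character-creation sliders.
        self.head = nn.Sequential(nn.Linear(512, num_params), nn.Sigmoid())

    def forward(self, embedding: torch.Tensor) -> torch.Tensor:
        return self.head(self.trunk(embedding))


if __name__ == "__main__":
    translator = FaceParameterTranslator()
    face_embedding = torch.randn(1, 512)  # stand-in for a real embedding
    params = translator(face_embedding)   # single forward pass, no iteration
    print(params.shape)                   # torch.Size([1, 264])
```

The speedup cited in the abstract follows from this structural change: an iterative approach must run many render-and-update cycles per input photo, whereas a translator amortizes that cost into training and answers each query with one forward propagation.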
