Paper Title


Exploring the Privacy Risks of Adversarial VR Game Design

Authors

Vivek Nair, Gonzalo Munilla Garrido, Dawn Song, James F. O'Brien

Abstract

Fifty study participants playtested an innocent-looking "escape room" game in virtual reality (VR). Within just a few minutes, an adversarial program had accurately inferred over 25 of their personal data attributes, from anthropometrics like height and wingspan to demographics like age and gender. As notoriously data-hungry companies become increasingly involved in VR development, this experimental scenario may soon represent a typical VR user experience. Since the Cambridge Analytica scandal of 2018, adversarially designed gamified elements have been known to constitute a significant privacy threat in conventional social platforms. In this work, we present a case study of how metaverse environments can similarly be adversarially constructed to covertly infer dozens of personal data attributes from seemingly anonymous users. While existing VR privacy research largely focuses on passive observation, we argue that because individuals subconsciously reveal personal information via their motion in response to specific stimuli, active attacks pose an outsized risk in VR environments.
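The paper does not publish its inference code here, but the kind of anthropometric attack the abstract describes can be illustrated with a minimal sketch. The function below is a hypothetical example (the function name, the 95th-percentile "standing" heuristic, and the fixed eye-to-crown offset are all assumptions, not the authors' method): it estimates height from the headset's vertical position and wingspan from the widest observed controller separation, e.g. during a reach elicited by in-game stimuli.

```python
import numpy as np

def infer_anthropometrics(hmd_positions, left_hand, right_hand):
    """Estimate height and wingspan from VR telemetry frames.

    Each argument is an (N, 3) sequence of world-space (x, y, z)
    positions in meters, with y as the vertical axis. This is an
    illustrative sketch, not the paper's actual attack pipeline.
    """
    hmd = np.asarray(hmd_positions, dtype=float)
    lh = np.asarray(left_hand, dtype=float)
    rh = np.asarray(right_hand, dtype=float)

    # Height: eye level while standing (robust high percentile to
    # ignore crouching frames), plus an assumed eye-to-crown offset.
    eye_level = np.percentile(hmd[:, 1], 95)
    height = float(eye_level + 0.11)  # ~11 cm offset (assumption)

    # Wingspan: widest observed hand-controller separation, as might
    # occur when a game covertly prompts the user to reach two
    # distant objects at once.
    wingspan = float(np.max(np.linalg.norm(lh - rh, axis=1)))
    return height, wingspan
```

In an adversarial game, the key point of the paper is that such measurements need not be taken passively: level design can actively elicit the poses (standing tall, reaching wide) that make these estimates accurate within minutes.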
