Paper Title

The Impact of Differential Privacy on Group Disparity Mitigation

Paper Authors

Victor Petrén Bach Hansen, Atula Tejaswi Neerkaje, Ramit Sawhney, Lucie Flek, Anders Søgaard

Abstract

The performance cost of differential privacy has, for some applications, been shown to be higher for minority groups; fairness, conversely, has been shown to disproportionally compromise the privacy of members of such groups. Most work in this area has been restricted to computer vision and risk assessment. In this paper, we evaluate the impact of differential privacy on fairness across four tasks, focusing on how attempts to mitigate privacy violations and between-group performance differences interact: Does privacy inhibit attempts to ensure fairness? To this end, we train $(\varepsilon,\delta)$-differentially private models with empirical risk minimization and group distributionally robust training objectives. Consistent with previous findings, we find that differential privacy increases between-group performance differences in the baseline setting; but more interestingly, differential privacy reduces between-group performance differences in the robust setting. We explain this by reinterpreting differential privacy as regularization.
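The setup described in the abstract combines $(\varepsilon,\delta)$-differentially private training with a group distributionally robust (group-DRO) objective. Below is a minimal sketch of how these two pieces can be combined, assuming PyTorch with Opacus for DP-SGD accounting and a simplified exponentiated-gradient group-DRO weight update; the model, synthetic data, and hyperparameters are illustrative placeholders, not the authors' exact configuration.

```python
# Minimal sketch (not the authors' code): DP-SGD training with a group-DRO loss.
# Assumes PyTorch and Opacus; data, model, and hyperparameters are illustrative.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Synthetic data: features, binary labels, and a group id per example (hypothetical).
n, d, n_groups, epochs = 512, 16, 2, 5
X = torch.randn(n, d)
y = torch.randint(0, 2, (n,))
g = torch.randint(0, n_groups, (n,))
loader = DataLoader(TensorDataset(X, y, g), batch_size=64, shuffle=True)

model = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Wrap model/optimizer/loader so training satisfies (epsilon, delta)-DP via DP-SGD.
privacy_engine = PrivacyEngine()
model, optimizer, loader = privacy_engine.make_private_with_epsilon(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    epochs=epochs,
    target_epsilon=8.0,   # illustrative privacy budget
    target_delta=1e-5,
    max_grad_norm=1.0,    # per-sample gradient clipping bound
)

criterion = nn.CrossEntropyLoss(reduction="none")
group_weights = torch.ones(n_groups) / n_groups  # DRO weights over groups
eta = 0.1                                        # step size for the weight update

for _ in range(epochs):
    for xb, yb, gb in loader:
        per_sample_loss = criterion(model(xb), yb)
        # Average the loss within each group present in the batch.
        group_losses = []
        for k in range(n_groups):
            mask = gb == k
            group_losses.append(per_sample_loss[mask].mean() if mask.any()
                                else torch.tensor(0.0))
        group_losses = torch.stack(group_losses)
        # Exponentiated-gradient update of group weights (group-DRO style).
        group_weights = group_weights * torch.exp(eta * group_losses.detach())
        group_weights = group_weights / group_weights.sum()
        # Upweight the worst-performing groups in the training objective.
        loss = (group_weights * group_losses).sum()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

print(f"privacy spent: epsilon ~ {privacy_engine.get_epsilon(delta=1e-5):.2f}")
```

In this sketch, per-sample gradient clipping and noise addition come from Opacus' DP-SGD wrapper, while the group-DRO weights shift the objective toward the currently worst-off group; the paper's observation is about how these two mechanisms interact.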
