Title
Differential Privacy Guarantees for Stochastic Gradient Langevin Dynamics
Authors
Abstract
We analyse the privacy leakage of noisy stochastic gradient descent by modelling Rényi divergence dynamics with Langevin diffusions. Inspired by recent work on non-stochastic algorithms, we derive similar desirable properties in the stochastic setting. In particular, we prove that the privacy loss converges exponentially fast for smooth and strongly convex objectives under constant step size, which is a significant improvement over previous DP-SGD analyses. We also extend our analysis to arbitrary sequences of varying step sizes and derive new utility bounds. Finally, we propose an implementation, and our experiments demonstrate the practical utility of our approach compared to classical DP-SGD libraries.
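The noisy stochastic gradient descent the abstract analyses can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the quadratic objective, step size `eta`, and noise scale `sigma` are assumptions chosen for the example, and no gradient clipping or privacy accounting is shown.

```python
import numpy as np

def sgld_step(theta, grad_fn, batch, eta, sigma, rng):
    """One noisy SGD (Langevin-style) step: minibatch gradient plus Gaussian noise."""
    g = grad_fn(theta, batch)
    noise = rng.normal(0.0, sigma, size=theta.shape)
    return theta - eta * g + np.sqrt(2.0 * eta) * noise

def grad_fn(theta, batch):
    # Gradient of the smooth, strongly convex toy objective
    # f(theta) = 0.5 * ||theta - mean(batch)||^2.
    return theta - batch.mean(axis=0)

rng = np.random.default_rng(0)
data = rng.normal(1.0, 0.5, size=(256, 2))  # synthetic dataset
theta = np.zeros(2)
for _ in range(500):
    batch = data[rng.choice(len(data), size=32, replace=False)]
    theta = sgld_step(theta, grad_fn, batch, eta=0.1, sigma=0.05, rng=rng)
# theta settles in a noisy neighbourhood of the data mean
```

For a strongly convex objective like this one, the iterates contract toward the minimiser at each step, which is the regime in which the abstract's exponential convergence of the privacy loss applies.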