Title

Private optimization in the interpolation regime: faster rates and hardness results

Authors

Hilal Asi, Karan Chadha, Gary Cheng, John Duchi

Abstract

In non-private stochastic convex optimization, stochastic gradient methods converge much faster on interpolation problems -- problems where there exists a solution that simultaneously minimizes all of the sample losses -- than on non-interpolating ones; we show that such improvements are generally impossible in the private setting. However, when the functions exhibit quadratic growth around the optimum, we show (near) exponential improvements in the private sample complexity. In particular, we propose an adaptive algorithm that improves the sample complexity to achieve expected error $\alpha$ from $\frac{d}{\varepsilon \sqrt{\alpha}}$ to $\frac{1}{\alpha^\rho} + \frac{d}{\varepsilon} \log\left(\frac{1}{\alpha}\right)$ for any fixed $\rho > 0$, while retaining the standard minimax-optimal sample complexity for non-interpolation problems. We prove a lower bound showing that the dimension-dependent term is tight. Furthermore, we provide a superefficiency result that demonstrates the necessity of the polynomial term for adaptive algorithms: any algorithm with polylogarithmic sample complexity for interpolation problems cannot achieve the minimax-optimal rates for the family of non-interpolation problems.
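
To make the gap between the two bounds concrete, here is a minimal numerical sketch in Python (not from the paper; all hidden constants are dropped, and the dimension $d$, privacy parameter $\varepsilon$, target error $\alpha$, and exponent $\rho$ take illustrative values) comparing the standard rate $\frac{d}{\varepsilon\sqrt{\alpha}}$ with the interpolation-regime rate $\frac{1}{\alpha^\rho} + \frac{d}{\varepsilon}\log\left(\frac{1}{\alpha}\right)$:

```python
import math

def standard_rate(d, eps, alpha):
    # General private SCO sample complexity (constants dropped):
    # d / (eps * sqrt(alpha)).
    return d / (eps * math.sqrt(alpha))

def interpolation_rate(d, eps, alpha, rho=0.1):
    # Improved rate under interpolation + quadratic growth
    # (constants dropped): 1/alpha^rho + (d/eps) * log(1/alpha).
    return 1.0 / alpha**rho + (d / eps) * math.log(1.0 / alpha)

d, eps = 100, 1.0  # illustrative dimension and privacy parameter
for alpha in (1e-2, 1e-4, 1e-6):
    print(f"alpha={alpha:.0e}: "
          f"standard ~ {standard_rate(d, eps, alpha):.3g}, "
          f"interpolation ~ {interpolation_rate(d, eps, alpha):.3g}")
```

As $\alpha \to 0$, the standard bound grows polynomially in $1/\alpha$, while the improved bound grows only polylogarithmically plus an arbitrarily small polynomial term $1/\alpha^\rho$; this is the (near) exponential improvement the abstract describes.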
