Paper Title
On the Sample Complexity of Privately Learning Unbounded High-Dimensional Gaussians
Paper Authors
Paper Abstract
We provide sample complexity upper bounds for agnostically learning multivariate Gaussians under the constraint of approximate differential privacy. These are the first finite sample upper bounds for general Gaussians which do not impose restrictions on the parameters of the distribution. Our bounds are near-optimal in the case when the covariance is known to be the identity, and conjectured to be near-optimal in the general case. From a technical standpoint, we provide analytic tools for arguing the existence of global "locally small" covers from local covers of the space. These are exploited using modifications of recent techniques for differentially private hypothesis selection. Our techniques may prove useful for privately learning other distribution classes which do not possess a finite cover.