Paper Title
Robust Generalization via $α$-Mutual Information
Paper Authors
Paper Abstract
The aim of this work is to provide bounds connecting two probability measures of the same event using Rényi $α$-Divergences and Sibson's $α$-Mutual Information, which generalize, respectively, the Kullback-Leibler Divergence and Shannon's Mutual Information. A particular case of interest arises when the two probability measures considered are a joint distribution and the corresponding product of marginals (representing the statistically independent scenario). In this case, a bound using Sibson's $α$-Mutual Information is retrieved, extending a result involving Maximal Leakage to general alphabets. These results have broad applications, from bounding the generalization error of learning algorithms to the more general framework of adaptive data analysis, provided that the divergences and/or information measures used are amenable to such an analysis ({\it i.e.,} are robust to post-processing and compose adaptively). The generalization error bounds are derived with respect to high-probability events, but a corresponding bound on the expected generalization error is also retrieved.
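For reference, the standard definitions of the two quantities named in the abstract are sketched below; this is background notation of my own choosing, not an excerpt from the paper, whose exact conventions (measure-theoretic setting, admissible range of $α$) may differ. For $\alpha > 0$, $\alpha \neq 1$, and $P \ll Q$, the Rényi $\alpha$-Divergence is commonly written as
\[
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \mathbb{E}_Q\!\left[\left(\frac{dP}{dQ}\right)^{\alpha}\right],
\]
which recovers the Kullback-Leibler Divergence in the limit $\alpha \to 1$. Sibson's $\alpha$-Mutual Information is then the corresponding divergence from the joint distribution to the closest product distribution with the given input marginal,
\[
I_\alpha(X; Y) = \min_{Q_Y} D_\alpha\!\left(P_{XY} \,\|\, P_X \otimes Q_Y\right),
\]
which recovers Shannon's Mutual Information as $\alpha \to 1$ and the Maximal Leakage $\mathcal{L}(X \to Y)$ as $\alpha \to \infty$, consistent with the abstract's description of the latter as a limiting special case.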