Paper Title
Nonasymptotic analysis of Stochastic Gradient Hamiltonian Monte Carlo under local conditions for nonconvex optimization
Paper Authors
Paper Abstract
We provide a nonasymptotic analysis of the convergence of the stochastic gradient Hamiltonian Monte Carlo (SGHMC) algorithm to a target measure in Wasserstein-2 distance without assuming log-concavity. Our analysis quantifies key theoretical properties of SGHMC as a sampler under local conditions, significantly improving on previous results. In particular, we prove that the Wasserstein-2 distance between the target and the law of SGHMC is uniformly controlled by the step-size of the algorithm, and therefore demonstrate that SGHMC can provide high-precision results uniformly in the number of iterations. The analysis also allows us to obtain nonasymptotic bounds for nonconvex optimization problems under local conditions, and implies that SGHMC, when viewed as a nonconvex optimizer, converges to a global minimum at the best known rates. We apply our results to obtain nonasymptotic bounds for scalable Bayesian inference and nonasymptotic generalization bounds.
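The abstract concerns the SGHMC sampler, i.e. a discretization of kinetic (underdamped) Langevin dynamics driven by a stochastic gradient oracle. The following is a minimal sketch of the standard SGHMC update, not the paper's specific construction; the step-size, friction coefficient, the double-well potential, and the Gaussian gradient noise are illustrative assumptions.

```python
import numpy as np

def sghmc(grad_u_est, theta0, step=1e-2, gamma=1.0, n_iter=5000, rng=None):
    """Euler discretization of kinetic Langevin dynamics with a noisy
    gradient oracle (the generic SGHMC recursion)."""
    rng = np.random.default_rng(rng)
    theta = np.asarray(theta0, dtype=float)
    v = np.zeros_like(theta)  # momentum
    samples = np.empty((n_iter,) + theta.shape)
    for k in range(n_iter):
        # position update
        theta = theta + step * v
        # momentum update: friction, stochastic gradient, injected noise
        noise = np.sqrt(2.0 * gamma * step) * rng.standard_normal(theta.shape)
        v = v - step * gamma * v - step * grad_u_est(theta, rng) + noise
        samples[k] = theta
    return samples

# Illustrative nonconvex target: double-well potential U(x) = (x^2 - 1)^2,
# gradient 4x(x^2 - 1), corrupted with Gaussian noise to mimic a
# minibatch gradient estimate (hypothetical setup, not from the paper).
def noisy_grad(theta, rng):
    return 4.0 * theta * (theta ** 2 - 1.0) + 0.1 * rng.standard_normal(theta.shape)

samples = sghmc(noisy_grad, theta0=np.array([0.0]), step=0.01,
                n_iter=20000, rng=0)
```

Despite the nonconvexity (two modes at x = ±1), the iterates remain stable and spend their time near the modes of exp(-U), which is the qualitative behavior the nonasymptotic Wasserstein-2 bounds make precise.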