Paper Title
Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping
Paper Authors
Paper Abstract
In this paper, we propose a new accelerated stochastic first-order method called clipped-SSTM for smooth convex stochastic optimization with heavy-tailed noise in the stochastic gradients, and we derive the first high-probability complexity bounds for this method, closing a gap in the theory of stochastic optimization with heavy-tailed noise. Our method is based on a special variant of accelerated Stochastic Gradient Descent (SGD) combined with clipping of the stochastic gradients. We extend our method to the strongly convex case and prove new complexity bounds that outperform state-of-the-art results in this setting. Finally, we extend our proof technique and derive the first non-trivial high-probability complexity bounds for SGD with clipping without the light-tails assumption on the noise.
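Concretely, the clipping operator at the core of such methods rescales a stochastic gradient so that its norm never exceeds a threshold lambda: clip(g, lambda) = min(1, lambda / ||g||_2) * g. Below is a minimal Python sketch of plain SGD with this clipping operator on a toy problem with synthetic heavy-tailed noise; the function names, constant step size, and fixed clipping level are illustrative assumptions, and the Nesterov-style acceleration, batching, and parameter schedules of clipped-SSTM itself are omitted.

    import numpy as np

    def clip(g, lam):
        # Clipping operator: clip(g, lam) = min(1, lam / ||g||_2) * g
        norm = np.linalg.norm(g)
        return g if norm <= lam else (lam / norm) * g

    def clipped_sgd(grad_oracle, x0, step_size, clip_level, n_steps):
        # Plain SGD with clipped stochastic gradients (illustrative sketch only;
        # clipped-SSTM additionally uses acceleration and tuned schedules).
        x = np.array(x0, dtype=float)
        for _ in range(n_steps):
            g = grad_oracle(x)                       # stochastic gradient, possibly heavy-tailed
            x = x - step_size * clip(g, clip_level)  # clipping keeps each step bounded
        return x

    # Toy problem: f(x) = ||x||^2 / 2 with heavy-tailed (Student-t) gradient noise;
    # df = 2.5 gives zero-mean noise with finite variance but no higher moments.
    rng = np.random.default_rng(0)
    oracle = lambda x: x + rng.standard_t(df=2.5, size=x.shape)
    x_hat = clipped_sgd(oracle, x0=10.0 * np.ones(5), step_size=0.05,
                        clip_level=1.0, n_steps=2000)

The point of the clipping step is that a single heavy-tailed gradient sample can no longer move the iterate arbitrarily far, which is what makes high-probability guarantees possible without light-tailed noise assumptions.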