Paper Title
Variance Regularization for Accelerating Stochastic Optimization

Paper Authors

Tong Yang, Long Sha, Pengyu Hong

Paper Abstract

While most gradient-based optimization methods today focus on exploiting high-dimensional geometric features, the random error that accumulates in the stochastic implementation of any such algorithm has received little attention. In this work, we propose a universal principle that reduces this accumulation of random error by exploiting the statistical information hidden in mini-batch gradients. This is achieved by regularizing the learning rate according to the mini-batch variance. Because our perspective is complementary, this regularization can further improve the stochastic implementation of generic first-order methods. With empirical results, we demonstrate that variance regularization speeds up convergence and stabilizes stochastic optimization.
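
To make the idea concrete, below is a minimal NumPy sketch of one plausible instantiation of the abstract's description: the per-coordinate step size is damped in proportion to the estimated variance of the mini-batch gradient. The damping form lr / (1 + lam * var) and the names grad_fn and lam are illustrative assumptions, not the paper's exact formula.

```python
import numpy as np

def variance_regularized_sgd(grad_fn, w, data, lr=0.1, lam=1.0,
                             batch_size=32, steps=100, rng=None):
    """SGD whose learning rate is damped by an estimate of the
    mini-batch gradient variance. A sketch of the paper's idea;
    the exact regularizer form here is an assumption."""
    rng = rng or np.random.default_rng(0)
    for _ in range(steps):
        batch = rng.choice(len(data), size=batch_size, replace=False)
        # Per-example gradients for the sampled mini-batch.
        per_example = np.stack([grad_fn(w, data[i]) for i in batch])
        g_mean = per_example.mean(axis=0)
        # Unbiased per-coordinate variance estimate, divided by the
        # batch size to approximate the variance of g_mean itself.
        g_var = per_example.var(axis=0, ddof=1) / batch_size
        # Shrink the step size where the stochastic gradient is noisy.
        effective_lr = lr / (1.0 + lam * g_var)
        w = w - effective_lr * g_mean
    return w

# Example: least-squares regression on synthetic data.
X = np.random.default_rng(1).normal(size=(1000, 5))
true_w = np.arange(1.0, 6.0)
y = X @ true_w
data = list(zip(X, y))
grad = lambda w, ex: 2.0 * (ex[0] @ w - ex[1]) * ex[0]
w_hat = variance_regularized_sgd(grad, np.zeros(5), data, steps=500)
```

Note the design choice this sketch highlights: coordinates with high gradient variance take smaller steps, which is exactly the "learning rate regularized by mini-batch variance" mechanism the abstract describes, layered on top of plain SGD without changing its geometry-driven update direction.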