Paper Title
Incremental Without Replacement Sampling in Nonconvex Optimization
Paper Authors
Paper Abstract
Minibatch decomposition methods for empirical risk minimization are commonly analysed in a stochastic approximation setting, also known as sampling with replacement. On the other hand, modern implementations of such techniques are incremental: they rely on sampling without replacement, for which available analyses are much scarcer. We provide convergence guarantees for the latter variant by analysing a versatile incremental gradient scheme. For this scheme, we consider constant, decreasing, or adaptive step sizes. In the smooth setting, we obtain explicit complexity estimates in terms of the epoch counter. In the nonsmooth setting, we prove that the sequence is attracted by solutions of the optimality conditions of the problem.
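To make the distinction concrete, here is a minimal sketch of the without-replacement (incremental, "random reshuffling") variant the abstract refers to, on a hypothetical scalar least-squares problem with a constant step size. The data, objective, and all names below are illustrative assumptions, not taken from the paper itself; the key point is that each epoch visits every sample exactly once via a fresh permutation, rather than drawing samples independently with replacement.

```python
import random

# Hypothetical least-squares example: minimize (1/n) * sum_i (w*x_i - y_i)^2
# over a scalar w, taking one gradient step per sample.
data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2)]  # (x_i, y_i) pairs

def grad_i(w, xy):
    """Gradient of the single-sample loss (w*x - y)^2 with respect to w."""
    x, y = xy
    return 2.0 * (w * x - y) * x

def incremental_gd(data, epochs=50, step=0.01, seed=0):
    """Sampling WITHOUT replacement: each epoch sweeps through all n
    samples exactly once, in a freshly shuffled order."""
    rng = random.Random(seed)
    w = 0.0
    for _ in range(epochs):
        order = list(range(len(data)))
        rng.shuffle(order)  # new permutation each epoch
        for i in order:
            w -= step * grad_i(w, data[i])
    return w

w = incremental_gd(data)
```

The with-replacement counterpart would instead draw `i = rng.randrange(len(data))` at every step, so some samples may be visited several times per epoch and others not at all; the abstract's point is that the reshuffling version above is what practical implementations actually do.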