Title
One-Step Estimation With Scaled Proximal Methods
Authors
Abstract
We study statistical estimators computed using iterative optimization methods that are not run until completion. Classical results on maximum likelihood estimators (MLEs) assert that a one-step estimator (OSE), in which a single Newton-Raphson iteration is performed from a starting point with certain properties, is asymptotically equivalent to the MLE. We further develop these early-stopping results by deriving properties of one-step estimators defined by a single iteration of scaled proximal methods. Our main results show the asymptotic equivalence of the likelihood-based estimator and various one-step estimators defined by scaled proximal methods. By interpreting OSEs as the last of a sequence of iterates, our results provide insight on scaling numerical tolerance with sample size. Our setting contains scaled proximal gradient descent applied to certain composite models as a special case, making our results applicable to many problems of practical interest. Additionally, our results provide support for the utility of the scaled Moreau envelope as a statistical smoother by interpreting scaled proximal descent as a quasi-Newton method applied to the scaled Moreau envelope.
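The classical one-step construction the abstract refers to can be illustrated on the Cauchy location model, a standard textbook example (not taken from this paper): the MLE has no closed form, but the sample median is a √n-consistent starting point, and a single Newton-Raphson step from it, using the expected Fisher information n/2, yields an estimator asymptotically equivalent to the MLE. A minimal sketch, with the model and all function names chosen here for illustration:

```python
import math
import random
import statistics

def cauchy_score(theta, xs):
    """Score (derivative of the Cauchy location log-likelihood) at theta."""
    return sum(2 * (x - theta) / (1 + (x - theta) ** 2) for x in xs)

def one_step_estimate(xs):
    """One Newton-Raphson iteration from a sqrt(n)-consistent start.

    Uses the sample median as the starting point and the expected
    Fisher information n/2 of the Cauchy location model in place of
    the observed Hessian.
    """
    theta0 = statistics.median(xs)   # consistent but inefficient start
    info = len(xs) / 2               # expected Fisher information
    return theta0 + cauchy_score(theta0, xs) / info

random.seed(0)
theta_true = 1.5
# Cauchy(theta_true) draws via inverse-CDF sampling
xs = [theta_true + math.tan(math.pi * (random.random() - 0.5))
      for _ in range(5000)]
theta1 = one_step_estimate(xs)
```

The paper's contribution replaces the Newton-Raphson step above with a single iteration of a scaled proximal method; the sketch only shows the classical baseline being generalized.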