Paper Title
Estimating the minimizer and the minimum value of a regression function under passive design
Paper Authors
Paper Abstract
We propose a new method for estimating the minimizer $\boldsymbol{x}^*$ and the minimum value $f^*$ of a smooth and strongly convex regression function $f$ from observations contaminated by random noise. Our estimator $\boldsymbol{z}_n$ of the minimizer $\boldsymbol{x}^*$ is based on a version of projected gradient descent with the gradient estimated by a regularized local polynomial algorithm. Next, we propose a two-stage procedure for estimating the minimum value $f^*$ of the regression function $f$. In the first stage, we construct a sufficiently accurate estimator of $\boldsymbol{x}^*$, which can be, for example, $\boldsymbol{z}_n$. In the second stage, we estimate the function value at the point obtained in the first stage using a rate-optimal nonparametric procedure. We derive non-asymptotic upper bounds for the quadratic risk and optimization error of $\boldsymbol{z}_n$, and for the risk of estimating $f^*$. We establish minimax lower bounds showing that, under a certain choice of parameters, the proposed algorithms achieve the minimax optimal rates of convergence on the class of smooth and strongly convex functions.
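To make the two-stage idea concrete, below is a minimal Python sketch, not the authors' exact procedure: the gradient at the current iterate is approximated by a ridge-regularized local linear fit on the fixed (passive) sample, which stands in for the paper's regularized local polynomial estimator, and the function value at the resulting point is read off from the same kind of local fit. All names (`local_linear_fit`, `estimate_minimizer`), the ball constraint, and the parameters `h`, `step`, `radius`, `ridge` are illustrative assumptions, not the paper's tuned choices.

```python
import numpy as np

def local_linear_fit(X, Y, z, h, ridge=1e-6):
    """Degree-1 local polynomial least-squares fit around z, using only the
    passive observations falling in a ball of radius h. Returns the fitted
    value at z (intercept) and the fitted slope, a crude gradient estimate.
    The ridge term is a simple stand-in for the paper's regularization."""
    diffs = X - z
    mask = np.linalg.norm(diffs, axis=1) <= h
    d = X.shape[1]
    if mask.sum() < d + 1:                      # too few local points to fit
        return 0.0, np.zeros(d)
    D = np.hstack([np.ones((mask.sum(), 1)), diffs[mask]])   # design [1, x - z]
    A = D.T @ D + ridge * np.eye(d + 1)
    coef = np.linalg.solve(A, D.T @ Y[mask])
    return coef[0], coef[1:]

def estimate_minimizer(X, Y, z0, radius, h, step, n_iter=300):
    """Stage 1: projected gradient descent onto the Euclidean ball of the
    given radius, with gradients estimated from the fixed sample."""
    z = np.array(z0, dtype=float)
    for _ in range(n_iter):
        _, g = local_linear_fit(X, Y, z, h)
        z = z - step * g
        nrm = np.linalg.norm(z)
        if nrm > radius:                        # projection step
            z *= radius / nrm
    return z

# Toy example: f(x) = ||x - x*||^2 observed with Gaussian noise on a
# uniform (passive) design.
rng = np.random.default_rng(0)
d, n = 2, 5000
x_star = np.array([0.3, -0.2])
X = rng.uniform(-1.0, 1.0, size=(n, d))
Y = np.sum((X - x_star) ** 2, axis=1) + 0.1 * rng.standard_normal(n)

z_hat = estimate_minimizer(X, Y, z0=np.zeros(d), radius=1.0, h=0.3, step=0.2)
f_hat, _ = local_linear_fit(X, Y, z_hat, h=0.2)   # Stage 2: value at z_hat
print(z_hat, f_hat)                                # close to x_star and 0
```

In this sketch the second stage simply reuses a local linear fit at $\boldsymbol{z}_n$; the paper instead applies a rate-optimal nonparametric estimator at that point, which is what yields the optimal rate for $f^*$.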