Paper Title

A Sharper Computational Tool for $\text{L}_2\text{E}$ Regression

Paper Authors

Xiaoqian Liu, Eric C. Chi, Kenneth Lange

Paper Abstract

Building on the previous research of Chi and Chi (2022), the current paper revisits estimation in robust structured regression under the $\text{L}_2\text{E}$ criterion. We adopt the majorization-minimization (MM) principle to design a new algorithm for updating the vector of regression coefficients. Our sharp majorization achieves faster convergence than the previous alternating proximal gradient descent algorithm (Chi and Chi, 2022). In addition, we reparameterize the model by substituting precision for scale and estimate precision via a modified Newton's method. This simplifies and accelerates overall estimation. We also introduce distance-to-set penalties to allow constrained estimation under nonconvex constraint sets. This tactic also improves performance in coefficient estimation and structure recovery. Finally, we demonstrate the merits of our improved tactics through a rich set of simulation examples and a real data application.
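To give a concrete sense of the MM idea the abstract refers to, the following is a minimal illustrative sketch, not the authors' sharper majorization or their modified Newton update for the precision. It uses the classical Gaussian $\text{L}_2\text{E}$ criterion with the precision $\tau$ held fixed: because $-\exp(-u)$ is concave in $u = \tau^2 r^2/2$, its tangent at the current iterate majorizes the loss, so each MM step for the coefficients reduces to weighted least squares with weights $w_i = \exp(-\tau^2 r_i^2/2)$. All function names and the choice of $\tau$ below are illustrative assumptions.

```python
import numpy as np

def l2e_objective(beta, tau, X, y):
    """Gaussian L2E criterion with precision tau = 1/sigma (up to constants)."""
    r = y - X @ beta
    return (tau / (2 * np.sqrt(np.pi))
            - tau * np.sqrt(2 / np.pi) * np.mean(np.exp(-0.5 * tau**2 * r**2)))

def mm_update_beta(X, y, tau, beta0, n_iter=200, tol=1e-10):
    """Illustrative MM iteration for beta at fixed precision tau.

    The tangent majorization of -exp(-tau^2 r^2 / 2) turns each step into a
    weighted least-squares solve with weights w_i = exp(-tau^2 r_i^2 / 2),
    so the L2E objective is non-increasing along the iterates.
    """
    beta = beta0.copy()
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.exp(-0.5 * tau**2 * r**2)   # small residual -> weight near 1
        WX = w[:, None] * X                # rows of X scaled by the weights
        beta_new = np.linalg.solve(X.T @ WX, WX.T @ y)
        if np.linalg.norm(beta_new - beta) < tol:
            return beta_new
        beta = beta_new
    return beta

# Demo on synthetic data with 10% gross outliers (hypothetical setup).
rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 0.1 * rng.standard_normal(n)
y[:20] += 10.0                             # contaminate 10% of the responses
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]   # non-robust warm start
beta_hat = mm_update_beta(X, y, tau=1.0, beta0=beta_ols)
```

In this sketch the outliers receive weights near $\exp(-50) \approx 0$, so the fit is driven by the inliers; a non-robust OLS start is used only as a warm start. The paper's contribution is a sharper majorization than this tangent bound, plus joint estimation of $\tau$, which this toy example does not attempt.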