Paper Title


Coordinate Descent for SLOPE

Paper Authors

Johan Larsson, Quentin Klopfenstein, Mathurin Massias, Jonas Wallin

Paper Abstract


The lasso is the most famous sparse regression and feature selection method. One reason for its popularity is the speed at which the underlying optimization problem can be solved. Sorted L-One Penalized Estimation (SLOPE) is a generalization of the lasso with appealing statistical properties. In spite of this, the method has not yet attracted widespread interest. A major reason for this is that current software packages that fit SLOPE rely on algorithms that perform poorly in high dimensions. To tackle this issue, we propose a new fast algorithm to solve the SLOPE optimization problem, which combines proximal gradient descent and proximal coordinate descent steps. We provide new results on the directional derivative of the SLOPE penalty and its related SLOPE thresholding operator, as well as provide convergence guarantees for our proposed solver. In extensive benchmarks on simulated and real data, we show that our method outperforms a long list of competing algorithms.
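The abstract refers to the SLOPE thresholding (proximal) operator, which is the building block of proximal solvers for this problem. The sketch below illustrates one well-known way to evaluate the sorted-ℓ1 prox, using a stack-based pool-adjacent-violators merge; it is a minimal illustration, not the paper's hybrid solver, and the function name `prox_sorted_l1` is an assumption for this example.

```python
import numpy as np

def prox_sorted_l1(y, lam):
    """Illustrative prox of the sorted-L1 (SLOPE) penalty:
        argmin_x 0.5 * ||x - y||^2 + sum_i lam[i] * |x|_(i),
    where |x|_(1) >= ... >= |x|_(n) and lam is nonincreasing and nonnegative.
    (A sketch via pool-adjacent-violators; not the paper's solver.)"""
    y = np.asarray(y, dtype=float)
    sign = np.sign(y)
    order = np.argsort(-np.abs(y))      # sort |y| in decreasing order
    d = np.abs(y)[order] - lam          # sequence to be made nonincreasing
    blocks = []                         # each block: [start, end, value]
    for i in range(len(d)):
        blocks.append([i, i, d[i]])
        # Merge adjacent blocks until block values strictly decrease.
        while len(blocks) > 1 and blocks[-2][2] <= blocks[-1][2]:
            s1, e1, v1 = blocks.pop()
            s0, e0, v0 = blocks.pop()
            n0, n1 = e0 - s0 + 1, e1 - s1 + 1
            blocks.append([s0, e1, (n0 * v0 + n1 * v1) / (n0 + n1)])
    x_sorted = np.empty_like(d)
    for s, e, v in blocks:
        x_sorted[s:e + 1] = max(v, 0.0)  # clip negative block values at zero
    x = np.empty_like(x_sorted)
    x[order] = x_sorted                  # undo the sort, restore signs
    return sign * x

# With all penalties equal, the SLOPE prox reduces to plain soft thresholding:
print(prox_sorted_l1([-3.0, 0.5, 2.0], np.array([1.0, 1.0, 1.0])))  # → [-2.  0.  1.]
```

The merge step is what distinguishes the sorted-ℓ1 prox from coordinate-wise soft thresholding: components whose shifted magnitudes would violate the ordering are averaged into a common value, which is how SLOPE clusters coefficients of similar size.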
