Paper Title

Convergence Rate Analysis of Proximal Iteratively Reweighted $\ell_1$ Methods for $\ell_p$ Regularization Problems

Authors

Hao Wang, Hao Zeng, Jiashan Wang

Abstract

In this paper, we focus on the local convergence rate analysis of the proximal iteratively reweighted $\ell_1$ algorithms for solving $\ell_p$ regularization problems, which are widely applied for inducing sparse solutions. We show that if the Kurdyka-Lojasiewicz (KL) property is satisfied, the algorithm converges to a unique first-order stationary point; furthermore, the algorithm has local linear convergence or local sublinear convergence. The theoretical results we derived are much stronger than the existing results for iteratively reweighted $\ell_1$ algorithms.
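To make the method under discussion concrete: a proximal iteratively reweighted $\ell_1$ step replaces the nonconvex $\ell_p$ term with a weighted $\ell_1$ surrogate built at the current iterate, then solves the resulting proximal subproblem in closed form by soft-thresholding. The sketch below is illustrative only and is not taken from the paper: it assumes a least-squares loss, a smoothed regularizer $\lambda \sum_i (|x_i| + \epsilon)^p$, and the hypothetical function name `prox_irl1`.

```python
import numpy as np

def prox_irl1(A, b, lam=0.1, p=0.5, eps=1e-3, beta=None, iters=200):
    """Minimal sketch of a proximal iteratively reweighted l1 (IRL1) method for
        min_x 0.5 * ||A x - b||^2 + lam * sum_i (|x_i| + eps)^p,  p in (0, 1).
    Illustrative assumptions: least-squares loss, epsilon-smoothed l_p term."""
    n = A.shape[1]
    if beta is None:
        # Lipschitz constant of the gradient of the smooth part: ||A||_2^2
        beta = np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)               # gradient of the smooth part
        w = p * (np.abs(x) + eps) ** (p - 1)   # reweighting at the current iterate
        z = x - grad / beta                    # forward (gradient) step
        # Soft-thresholding solves the weighted-l1 proximal subproblem exactly.
        x = np.sign(z) * np.maximum(np.abs(z) - lam * w / beta, 0.0)
    return x
```

Because each weighted $\ell_1$ surrogate majorizes the concave $p$-th-power term, the objective value is nonincreasing along the iterates, which is the structure the paper's KL-based local convergence rate analysis exploits.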
