Paper Title


A first-order optimization algorithm for statistical learning with hierarchical sparsity structure

Authors

Dewei Zhang, Yin Liu, Sam Davanloo Tajbakhsh

Abstract


In many statistical learning problems, it is desired that the optimal solution conform to an a priori known sparsity structure represented by a directed acyclic graph. Inducing such structures by means of convex regularizers requires nonsmooth penalty functions that exploit group overlapping. Our study focuses on evaluating the proximal operator of the latent overlapping group lasso developed by Jacob et al. (2009). We implemented an Alternating Direction Method of Multipliers (ADMM) with a sharing scheme to solve large-scale instances of the underlying optimization problem efficiently. In the absence of strong convexity, global linear convergence of the algorithm is established using error bound theory. More specifically, the paper contributes to establishing primal and dual error bounds when the nonsmooth component of the objective function does not have a polyhedral epigraph. We also investigate the effect of the graph structure on the speed of convergence of the algorithm. Detailed numerical simulation studies over different graph structures supporting the proposed algorithm are provided, along with two applications in learning.
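The proximal operator studied in the abstract can be illustrated with a minimal sketch. The latent overlapping group lasso penalty decomposes `x` into latent vectors `v_g`, one per group, each supported only on its group, and penalizes the sum of their Euclidean norms; this is exactly the "sharing" form of consensus ADMM (one local variable per group, coupled through their sum). The sketch below is an assumption-laden toy implementation of that idea, not the paper's actual algorithm: the update formulas follow the standard scaled-form sharing ADMM, and the function and parameter names (`prox_log_lasso`, `rho`, `iters`) are invented for illustration.

```python
import numpy as np

def group_soft_threshold(a, thr):
    """Block soft-thresholding: the proximal operator of thr * ||.||_2."""
    nrm = np.linalg.norm(a)
    if nrm <= thr:
        return np.zeros_like(a)
    return (1.0 - thr / nrm) * a

def prox_log_lasso(b, groups, lam, rho=1.0, iters=1000):
    """Toy sharing-scheme ADMM for the prox of the latent overlapping
    group lasso:
        min_x 0.5*||x - b||^2 + lam * Omega_LOG(x),
        Omega_LOG(x) = min { sum_g ||v_g||_2 : sum_g v_g = x,
                             supp(v_g) in group g }.
    `groups` is a list of index lists; overlaps are allowed.
    """
    n, N = b.size, len(groups)
    V = np.zeros((N, n))   # latent vectors v_g, one row per group
    zbar = np.zeros(n)     # average of the shared variable
    u = np.zeros(n)        # scaled dual variable
    for _ in range(iters):
        vbar = V.mean(axis=0)
        # v_g update: block soft-threshold restricted to each group's support
        for i, g in enumerate(groups):
            a = V[i] - vbar + zbar - u
            vg = np.zeros(n)
            vg[g] = group_soft_threshold(a[g], lam / rho)
            V[i] = vg
        vbar = V.mean(axis=0)
        # zbar update: the quadratic loss 0.5*||N*zbar - b||^2 gives a
        # closed-form minimizer
        zbar = (b + rho * (vbar + u)) / (N + rho)
        # scaled dual ascent
        u = u + vbar - zbar
    return V.sum(axis=0)   # x = sum_g v_g
```

As a sanity check, with a single group covering all coordinates the penalty reduces to `lam * ||x||_2`, so the routine should reproduce plain block soft-thresholding of `b`.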
