Paper Title


A Parameter-free and Projection-free Restarting Level Set Method for Adaptive Constrained Convex Optimization Under the Error Bound Condition

Authors

Qihang Lin, Runchao Ma, Selvaprabu Nadarajah, Negar Soheili

Abstract

Recent efforts to accelerate first-order methods have focused on convex optimization problems that satisfy a geometric property known as the error-bound condition, which covers a broad class of problems, including piecewise linear programs and strongly convex programs. Parameter-free first-order methods that employ projection-free updates have the potential to broaden the benefit of acceleration. Such a method has been developed for unconstrained convex optimization but is lacking for general constrained convex optimization. We propose a parameter-free level-set method for the latter constrained case based on projection-free subgradient descent that exhibits accelerated convergence for problems that satisfy an error-bound condition. Our method maintains a separate copy of the level-set sub-problem for each level parameter value and restarts the computation of these copies based on objective function progress. Applying such a restarting scheme in a level-set context is novel and results in an algorithm that dynamically adapts the precision of each copy. This property is key to extending prior restarting methods based on static precision, which were proposed for unconstrained convex optimization, to handle constraints. We report promising numerical performance relative to benchmark methods.
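To give a rough sense of the level-set idea the abstract builds on, the toy Python sketch below solves a one-dimensional constrained problem by subgradient descent on the level-set sub-problem L_r(x) = max{f(x) - r, g(x)} and adjusts the level parameter r by bisection. This is only an illustration of the generic level-set template, not the authors' parameter-free restarting algorithm; the toy problem, tolerances, and all function names are our own choices.

```python
# Toy problem (ours, for illustration): minimize f(x) = x^2 subject to
# g(x) = 1 - x <= 0, i.e. x >= 1. The optimal value is f(1) = 1.

f  = lambda x: x * x          # objective
g  = lambda x: 1.0 - x        # constraint function, g(x) <= 0 means feasible
df = lambda x: 2.0 * x        # gradient of f
dg = lambda x: -1.0           # gradient of g

def level_subproblem(r, x0, steps=4000, lr=0.005):
    """Approximately minimize L_r(x) = max(f(x) - r, g(x)) by subgradient
    descent. Its minimum value H(r) is <= 0 exactly when some feasible
    point attains objective value at most r."""
    x, best = x0, float("inf")
    for _ in range(steps):
        best = min(best, max(f(x) - r, g(x)))
        # Subgradient of the max: gradient of whichever piece is active.
        grad = df(x) if f(x) - r >= g(x) else dg(x)
        x -= lr * grad
    return x, best

# Outer loop: root-find the level parameter by bisection, shrinking
# [lo, hi] around the optimal value and warm-starting each sub-problem.
lo, hi, x = 0.0, f(5.0), 5.0
for _ in range(30):
    r = 0.5 * (lo + hi)
    x, h = level_subproblem(r, x)
    if h <= 0.02:    # level r looks achievable: optimal value is at most ~r
        hi = r
    else:            # level r not achievable: optimal value exceeds r
        lo = r

print(hi)  # close to the optimal value 1
```

The paper's method replaces this simple bisection with copies of the sub-problem maintained per level value and restarted based on objective progress, which is what yields the accelerated rate under the error-bound condition.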
