Paper Title

Randomized Fast Subspace Descent Methods

Paper Authors

Long Chen, Xiaozhe Hu, Huiwen Wu

Paper Abstract

Randomized Fast Subspace Descent (RFASD) methods are developed and analyzed for smooth, unconstrained convex optimization problems. The efficiency of the method relies on a space decomposition that is stable in the $A$-norm and on the condition number $\kappa_A$, measured in the $A$-norm, being small. At each iteration, a subspace is chosen randomly, either uniformly or with probability proportional to the local Lipschitz constants; a preconditioned gradient descent step is then applied in the chosen subspace. RFASD converges sublinearly for convex functions and linearly for strongly convex functions. Compared with randomized block coordinate descent methods, RFASD converges faster provided $\kappa_A$ is small and the subspace decomposition is $A$-stable. This improvement is supported by considering a multilevel space decomposition for Nesterov's 'worst' problem.
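To make the iteration pattern concrete, here is a minimal sketch of randomized subspace descent with Lipschitz-proportional sampling on a small quadratic. The test problem, the simple contiguous blocks, and the scalar step $1/L_i$ (playing the role of a trivial preconditioner) are illustrative assumptions; they are not the paper's $A$-stable multilevel decomposition or Nesterov's 'worst' problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical quadratic test problem: f(x) = 0.5 x^T Q x - b^T x.
n, block_size = 8, 2
M = rng.standard_normal((n, n))
Q = M.T @ M + np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)
x_star = np.linalg.solve(Q, b)   # exact minimizer, for reference only

# Simple non-overlapping coordinate blocks as the subspaces.
blocks = [np.arange(i, i + block_size) for i in range(0, n, block_size)]

# Local Lipschitz constant of each block gradient = largest eigenvalue
# of the corresponding diagonal sub-block of Q.
L = np.array([np.linalg.eigvalsh(Q[np.ix_(idx, idx)]).max() for idx in blocks])
prob = L / L.sum()               # sampling proportional to local Lipschitz constants

x = np.zeros(n)
for _ in range(4000):
    i = rng.choice(len(blocks), p=prob)
    idx = blocks[i]
    g = Q[idx] @ x - b[idx]      # gradient restricted to the chosen subspace
    x[idx] -= g / L[i]           # descent step with the scalar "preconditioner" 1/L_i

print(np.linalg.norm(x - x_star))  # error should be small after enough iterations
```

With uniform sampling one would instead pass `p=None` to `rng.choice`; the paper's point is that a good space decomposition and preconditioner shrink the effective condition number $\kappa_A$ that governs how many such steps are needed.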
