Paper Title
On proximal gradient mapping and its minimization in norm via potential function-based acceleration
Paper Authors
Paper Abstract
The proximal gradient descent method, well known for solving composite optimization problems, can be completely described in terms of the proximal gradient mapping. In this paper, we highlight two properties of the proximal gradient mapping established in our previous work, namely norm monotonicity and refined descent, which allow us to extend the recently proposed potential function-based framework from gradient descent to proximal gradient descent.
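For context, a standard textbook definition of the proximal gradient mapping (not quoted from this paper; the splitting F = f + g with smooth f and proximable g, and the step size \lambda > 0, are assumed notation) is

\[
  G_{\lambda}(x) \;=\; \frac{1}{\lambda}\Bigl(x - \operatorname{prox}_{\lambda g}\bigl(x - \lambda \nabla f(x)\bigr)\Bigr),
  \qquad
  x^{+} \;=\; x - \lambda\, G_{\lambda}(x),
\]

so one proximal gradient step moves a distance \lambda along G_{\lambda}(x), and G_{\lambda} reduces to \nabla f when g \equiv 0, which is why statements about gradient descent can be rephrased in terms of this mapping.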