Paper Title

Variance reduction for Random Coordinate Descent-Langevin Monte Carlo

Paper Authors

Zhiyan Ding, Qin Li

Paper Abstract

Sampling from a log-concave distribution function is a core problem with wide applications in Bayesian statistics and machine learning. While most gradient-free methods have slow convergence rates, Langevin Monte Carlo (LMC), which provides fast convergence, requires the computation of gradients. In practice, one uses finite-differencing approximations as surrogates, and the method is expensive in high dimensions. A natural strategy to reduce the computational cost in each iteration is to utilize random gradient approximations, such as random coordinate descent (RCD) or simultaneous perturbation stochastic approximation (SPSA). We show by a counterexample that blindly applying RCD does not achieve the goal in the most general setting. The high variance induced by the randomness means a larger number of iterations is needed, and this balances out the saving in each iteration. We then introduce a new variance reduction approach, termed Randomized Coordinates Averaging Descent (RCAD), and incorporate it with both overdamped and underdamped LMC. The resulting methods are termed RCAD-O-LMC and RCAD-U-LMC respectively. They still sit in the random gradient approximation framework, so the computational cost in each iteration remains low. However, by employing RCAD, the variance is reduced, and the methods converge within the same number of iterations as the classical overdamped and underdamped LMC. This leads to an overall computational saving.
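To make the per-iteration structure concrete, below is a minimal, hypothetical Python sketch of overdamped LMC with an RCD-style gradient estimate, plus an RCAD-flavoured variance-reduced variant, based only on the description in the abstract. The quadratic target f, the step size h, the finite-difference stencil, and the stored-derivative bookkeeping in rcad_lmc are illustrative assumptions, not the paper's exact algorithms or constants.

```python
import numpy as np

# Hypothetical target chosen only so the example runs end to end:
# f(x) = ||x||^2 / 2, so exp(-f) is a log-concave (standard Gaussian) density.
def f(x):
    return 0.5 * np.dot(x, x)

def partial_fd(func, x, i, eps=1e-5):
    # Central finite difference for the i-th partial derivative: two function
    # evaluations, which is the per-iteration saving that RCD-type methods exploit.
    e = np.zeros_like(x)
    e[i] = eps
    return (func(x + e) - func(x - e)) / (2 * eps)

def rcd_lmc(func, d, h=1e-2, n_iter=20000, seed=0):
    # Overdamped LMC with an RCD gradient estimate: pick one coordinate uniformly
    # at random and rescale by d so the estimator stays unbiased. The abstract's
    # point is that the variance of this single-coordinate surrogate can be large.
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(d)
        g = np.zeros(d)
        g[i] = d * partial_fd(func, x, i)
        x = x - h * g + np.sqrt(2 * h) * rng.standard_normal(d)
    return x

def rcad_lmc(func, d, h=1e-2, n_iter=20000, seed=0):
    # RCAD-flavoured variant (a reading of the abstract, not the paper's exact
    # scheme): maintain a stored vector of coordinate derivatives, refresh one
    # randomly chosen entry per iteration, and use the whole stored vector as the
    # gradient surrogate. The per-iteration cost stays at one new coordinate
    # derivative, but the surrogate's variance is much lower.
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    g = np.array([partial_fd(func, x, i) for i in range(d)])  # one-off warm start
    for _ in range(n_iter):
        i = rng.integers(d)
        g[i] = partial_fd(func, x, i)
        x = x - h * g + np.sqrt(2 * h) * rng.standard_normal(d)
    return x

if __name__ == "__main__":
    d = 10
    print("RCD-O-LMC sample: ", rcd_lmc(f, d))
    print("RCAD-O-LMC sample:", rcad_lmc(f, d))
```

In this toy setting both variants use the same number of function evaluations per iteration; the difference the abstract emphasizes is the variance of the gradient surrogate, which governs how many iterations are needed overall.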
