Paper Title

Implicit Regularization Properties of Variance Reduced Stochastic Mirror Descent

Paper Authors

Yiling Luo, Xiaoming Huo, Yajun Mei

Paper Abstract

In machine learning and statistical data analysis, we often run into an objective function that is a summation: the number of terms in the summation may equal the sample size, which can be enormous. In such a setting, the stochastic mirror descent (SMD) algorithm is a numerically efficient method -- each iteration involves only a very small subset of the data. The variance-reduced version of SMD (VRSMD) can further improve on SMD by inducing faster convergence. On the other hand, algorithms such as gradient descent and stochastic gradient descent have an implicit regularization property that leads to better performance in terms of generalization error. Little is known about whether such a property holds for VRSMD. We prove here that the discrete VRSMD estimator sequence converges to the minimum mirror interpolant in linear regression. This establishes the implicit regularization property of VRSMD. As an application of the above result, we derive a model estimation accuracy result in the setting where the true model is sparse. We use numerical examples to illustrate the empirical power of VRSMD.
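To make the setup concrete, below is a minimal sketch of an SVRG-style variance-reduced stochastic mirror descent applied to linear regression. It assumes a separable power potential as the mirror map; the function names (`vrsmd`, `grad_i`) and all parameter values are illustrative choices for this sketch, not settings taken from the paper.

```python
import numpy as np


def grad_i(A, b, x, i):
    """Gradient of the i-th squared-error term (1/2)(a_i^T x - b_i)^2."""
    return A[i] * (A[i] @ x - b[i])


def vrsmd(A, b, gamma=1.0, eta=2e-3, epochs=500, seed=0):
    """Sketch of SVRG-style variance-reduced stochastic mirror descent.

    Uses the separable potential psi(x) = sum_j |x_j|^(1+gamma)/(1+gamma),
    whose mirror map and its inverse are coordinate-wise power functions;
    gamma = 1 reduces to variance-reduced SGD. All defaults here are
    illustrative, not values from the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        x_ref = x.copy()
        full_grad = A.T @ (A @ x_ref - b) / n  # anchor gradient for the epoch
        for _ in range(n):
            i = rng.integers(n)
            # variance-reduced stochastic gradient: unbiased, with variance
            # that shrinks as x and x_ref approach the solution set
            g = grad_i(A, b, x, i) - grad_i(A, b, x_ref, i) + full_grad
            # mirror descent step: move in the dual space, map back to primal
            z = np.sign(x) * np.abs(x) ** gamma - eta * g
            x = np.sign(z) * np.abs(z) ** (1.0 / gamma)
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d = 40, 100                    # overparameterized: more features than samples
    x_true = np.zeros(d)
    x_true[:5] = 1.0                  # sparse true model
    A = rng.standard_normal((n, d))
    b = A @ x_true                    # noiseless, so the data are interpolable
    x_hat = vrsmd(A, b)
    print("max residual:", np.max(np.abs(A @ x_hat - b)))
```

In this demo the data are interpolable (n < d and b = A x_true), so the residual printed at the end shrinking toward zero illustrates convergence to an interpolant; which interpolant is selected depends on the choice of mirror map, which is the implicit regularization effect the paper characterizes.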
