Paper title
Low rank approximation of positive semi-definite symmetric matrices using Gaussian elimination and volume sampling
Paper authors
Paper abstract
Positive semi-definite matrices commonly occur as normal matrices of least squares problems in statistics or as kernel matrices in machine learning and approximation theory. They are typically large and dense. Thus algorithms to solve systems with such a matrix can be very costly. A core idea to reduce computational complexity is to approximate the matrix by one with a low rank. The optimal and well understood choice is based on the eigenvalue decomposition of the matrix. Unfortunately, this is computationally very expensive. Cheaper methods are based on Gaussian elimination, but they require pivoting. We will show how invariant matrix theory provides explicit error formulas for an averaged error based on volume sampling. The formula leads to ratios of elementary symmetric polynomials in the eigenvalues. We discuss some new and old bounds and include several examples where an expected error norm can be computed exactly.
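The expected-error formula mentioned in the abstract involves ratios of elementary symmetric polynomials e_k evaluated at the eigenvalues of the matrix. The abstract does not state the exact formula, but the quantities themselves are easy to evaluate; the sketch below (an illustration, not the paper's method) computes e_0, ..., e_n via the standard one-eigenvalue-at-a-time recurrence and forms the ratio e_{k+1}/e_k:

```python
import numpy as np

def elementary_symmetric(eigvals):
    """Return [e_0, e_1, ..., e_n] for the given eigenvalues.

    Uses the recurrence: appending an eigenvalue x updates
    e_j <- e_j + x * e_{j-1}, starting from e_0 = 1.
    """
    eigvals = np.asarray(eigvals, dtype=float)
    e = np.zeros(len(eigvals) + 1)
    e[0] = 1.0
    for x in eigvals:
        # RHS is fully evaluated before assignment, so old values are used.
        e[1:] += x * e[:-1]
    return e

def esp_ratio(eigvals, k):
    """Ratio e_{k+1}(lambda) / e_k(lambda), the kind of quantity that
    appears in volume-sampling error formulas for rank-k approximation."""
    e = elementary_symmetric(eigvals)
    return e[k + 1] / e[k]

# Example: eigenvalues 3, 2, 1 give e_1 = 6, e_2 = 11, e_3 = 6.
print(esp_ratio([3.0, 2.0, 1.0], 1))  # e_2 / e_1 = 11/6
```

For a rapidly decaying spectrum the ratio e_{k+1}/e_k shrinks quickly with k, which is the regime where low-rank approximation pays off.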