Paper Title
A Fast Linear Regression via SVD and Marginalization
Paper Authors
Paper Abstract
We describe a numerical scheme for evaluating the posterior moments of Bayesian linear regression models with partial pooling of the coefficients. The principal analytical tool of the evaluation is a change of basis from coefficient space to the space of singular vectors of the matrix of predictors. After this change of basis and an analytical integration, we reduce the problem of finding moments of a density over k + m dimensions to that of finding moments of an m-dimensional density, where k is the number of coefficients and k + m is the dimension of the posterior. Moments can then be computed using, for example, MCMC, the trapezoid rule, or adaptive Gaussian quadrature. An evaluation of the SVD of the matrix of predictors is the dominant computational cost and is performed once during the precomputation stage. We demonstrate numerical results of the algorithm. The scheme described in this paper generalizes naturally to multilevel and multi-group hierarchical regression models where normal-normal parameters appear.
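To make the SVD-based reduction concrete, the following is a minimal NumPy sketch, not the paper's exact algorithm. It assumes the simplest partially pooled model y = Xb + eps with eps ~ N(0, sigma^2 I) and prior b ~ N(0, tau^2 I); all variable names are illustrative. After a one-time SVD of the predictor matrix, the conditional posterior moments of the coefficients given the hyperparameters (sigma, tau) are diagonal in the singular-vector basis, so each hyperparameter evaluation is cheap and moments over the remaining low-dimensional hyperparameter density can be computed by quadrature or MCMC.

```python
import numpy as np

# Synthetic problem (illustrative only): n observations, k coefficients.
rng = np.random.default_rng(0)
n, k = 50, 5
X = rng.standard_normal((n, k))
y = rng.standard_normal(n)

# One-time precomputation: thin SVD of the predictor matrix (dominant cost).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Uty = U.T @ y  # projections of the data onto the left singular vectors

def conditional_posterior(sigma, tau):
    """Posterior mean of b | y, sigma, tau and the posterior variances
    in the rotated (singular-vector) basis, where they are diagonal."""
    prec = s**2 / sigma**2 + 1.0 / tau**2      # diagonal posterior precision
    gamma_mean = (s / sigma**2) * Uty / prec   # mean in singular-vector basis
    mean = Vt.T @ gamma_mean                   # rotate back to coefficient space
    return mean, 1.0 / prec

# Sanity check against the direct dense formula for one hyperparameter setting:
# Sigma = (X^T X / sigma^2 + I / tau^2)^{-1},  mu = Sigma X^T y / sigma^2.
sigma, tau = 0.7, 2.0
mean_svd, _ = conditional_posterior(sigma, tau)
Sigma_dense = np.linalg.inv(X.T @ X / sigma**2 + np.eye(k) / tau**2)
mean_dense = Sigma_dense @ X.T @ y / sigma**2
print(np.allclose(mean_svd, mean_dense))  # True
```

Because `conditional_posterior` costs only O(k) per call (plus an O(k^2) rotation), one can afford to evaluate it on a dense grid of (sigma, tau) values and integrate against the marginal hyperparameter density with, e.g., the trapezoid rule, which is the m-dimensional problem the abstract refers to.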