Title

Sparsity promoting hybrid solvers for hierarchical Bayesian inverse problems

Authors

Daniela Calvetti, Monica Pragliola, Erkki Somersalo

Abstract

The recovery of sparse generative models from few noisy measurements is an important and challenging problem. Many deterministic algorithms rely on some form of $\ell_1$-$\ell_2$ minimization to combine the computational convenience of the $\ell_2$ penalty with the sparsity promotion of the $\ell_1$ penalty. It was recently shown within the Bayesian framework that sparsity promotion and computational efficiency can be attained with hierarchical models with conditionally Gaussian priors and gamma hyperpriors. The related Gibbs energy function is a convex functional, and its minimizer, the MAP estimate of the posterior, can be computed efficiently with the globally convergent Iterative Alternating Sequential (IAS) algorithm \cite{CSS}. Generalizing the hyperpriors of these sparsity-promoting hierarchical models to the generalized gamma family yields Gibbs energy functionals that are either globally convex or locally convex for certain choices of the hyperparameters \cite{CPrSS}. The main difficulty in computing the MAP solution with greedy hyperpriors that strongly promote sparsity is the presence of local minima. To avoid premature stopping at a spurious local minimizer, we propose two hybrid algorithms that first exploit the global convergence associated with gamma hyperpriors to arrive in a neighborhood of the unique minimizer, and then switch to a generalized gamma hyperprior that promotes sparsity more strongly. The performance of the two algorithms is illustrated with computed examples.
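The first stage of the hybrid strategy described in the abstract (IAS with a gamma hyperprior, for which the Gibbs energy is globally convex) can be sketched as below. This is an illustrative reconstruction, not the authors' reference implementation: the function name `ias_gamma`, the parameter defaults (`sigma`, `beta`, `theta_star`, `n_iter`), and the dense linear solve are all assumptions made for the sketch. The second hybrid stage would replace the closed-form `theta` update with the update corresponding to a more strongly sparsity-promoting generalized gamma hyperprior.

```python
import numpy as np

def ias_gamma(A, b, sigma=0.05, beta=1.6, theta_star=1e-3, n_iter=50):
    """Illustrative sketch of one IAS stage with a gamma hyperprior.

    Model assumption: b = A x + noise, x_j ~ N(0, theta_j),
    theta_j ~ Gamma(beta, theta_star).  The iteration alternates:
      1. x-update: minimize ||A x - b||^2 / (2 sigma^2)
                   + sum_j x_j^2 / (2 theta_j)   (Tikhonov-type solve)
      2. theta-update: closed-form minimizer of the Gibbs energy in theta
    Parameter names and defaults here are assumptions for illustration.
    """
    m, n = A.shape
    theta = np.full(n, theta_star)           # initial prior variances
    eta = beta - 1.5                         # eta > 0 keeps the energy convex
    x = np.zeros(n)
    for _ in range(n_iter):
        # x-update: weighted least-squares / Tikhonov solve
        D = np.diag(1.0 / theta)
        x = np.linalg.solve(A.T @ A / sigma**2 + D, A.T @ b / sigma**2)
        # theta-update: closed form for the gamma hyperprior (r = 1)
        theta = theta_star * (eta / 2 + np.sqrt(eta**2 / 4
                                                + x**2 / (2 * theta_star)))
    return x, theta
```

In the hybrid scheme, the output `(x, theta)` of this globally convergent stage would serve as the initialization for a second IAS run with the greedier generalized gamma hyperprior, reducing the risk of stopping at a spurious local minimizer.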
