Paper Title


Score Matched Neural Exponential Families for Likelihood-Free Inference

Authors

Lorenzo Pacchiardi, Ritabrata Dutta

Abstract


Bayesian Likelihood-Free Inference (LFI) approaches allow one to obtain posterior distributions for stochastic models with intractable likelihoods by relying on model simulations. In Approximate Bayesian Computation (ABC), a popular LFI method, summary statistics are used to reduce data dimensionality. ABC algorithms adaptively tailor simulations to the observation in order to sample from an approximate posterior, whose form depends on the chosen statistics. In this work, we introduce a new way to learn ABC statistics: we first generate parameter-simulation pairs from the model independently of the observation; then, we use Score Matching to train a neural conditional exponential family to approximate the likelihood. The exponential family is the largest class of distributions with fixed-size sufficient statistics; using them in ABC is therefore intuitively appealing and achieves state-of-the-art performance. In parallel, we insert our likelihood approximation into an MCMC scheme for doubly intractable distributions to draw posterior samples. This can be repeated for any number of observations with no additional model simulations, with performance comparable to related approaches. We validate our methods on toy models with known likelihoods and a high-dimensional time-series model.
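To make the Score Matching step concrete, below is a minimal sketch (not the paper's neural implementation) of fitting an exponential-family density by minimizing the Hyvärinen score-matching objective. It assumes a 1-D Gaussian target and hand-coded sufficient statistics T(x) = (x, x²), so the unnormalized log-density is η₁x + η₂x², the score is η₁ + 2η₂x, and the curvature term is 2η₂; the data, learning rate, and iteration count are illustrative choices.

```python
import numpy as np

# Simulated data from the target: N(mu, sigma^2).
rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0
x = rng.normal(mu, sigma, size=10_000)

# Score-matching loss J(eta) = E[ 0.5 * (d/dx log p)^2 + d^2/dx^2 log p ]
# for log p(x) = eta1*x + eta2*x^2 + const (normalizer drops out).
def sm_loss(eta, x):
    eta1, eta2 = eta
    score = eta1 + 2.0 * eta2 * x   # d/dx log p
    return np.mean(0.5 * score**2 + 2.0 * eta2)

# Gradient descent with closed-form gradients of the quadratic loss.
eta = np.array([0.0, -0.1])
lr = 0.05
for _ in range(2000):
    eta1, eta2 = eta
    score = eta1 + 2.0 * eta2 * x
    g1 = np.mean(score)                  # dJ/d eta1
    g2 = np.mean(score * 2.0 * x) + 2.0  # dJ/d eta2
    eta = eta - lr * np.array([g1, g2])
```

For a Gaussian the true natural parameters are η₁ = μ/σ² = 0.25 and η₂ = −1/(2σ²) = −0.125; the fitted `eta` should land near these values, which is the key property the method exploits: score matching recovers the natural parameters without ever computing the intractable normalizing constant.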
