Paper Title

Targeted Separation and Convergence with Kernel Discrepancies

Paper Authors

Alessandro Barp, Carl-Johann Simon-Gabriel, Mark Girolami, Lester Mackey

Paper Abstract

Maximum mean discrepancies (MMDs) like the kernel Stein discrepancy (KSD) have grown central to a wide range of applications, including hypothesis testing, sampler selection, distribution approximation, and variational inference. In each setting, these kernel-based discrepancy measures are required to (i) separate a target P from other probability measures or even (ii) control weak convergence to P. In this article we derive new sufficient and necessary conditions to ensure (i) and (ii). For MMDs on separable metric spaces, we characterize those kernels that separate Bochner embeddable measures and introduce simple conditions for separating all measures with unbounded kernels and for controlling convergence with bounded kernels. We use these results on $\mathbb{R}^d$ to substantially broaden the known conditions for KSD separation and convergence control and to develop the first KSDs known to exactly metrize weak convergence to P. Along the way, we highlight the implications of our results for hypothesis testing, measuring and improving sample quality, and sampling with Stein variational gradient descent.
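For readers less familiar with the notation, here is a brief sketch of the two discrepancies named above, using standard definitions from the kernel-methods literature rather than wording quoted from the paper. Given a reproducing kernel $k$ with RKHS $\mathcal{H}_k$ and mean embedding $\mu_Q = \int k(\cdot, x)\,\mathrm{d}Q(x)$, the MMD between $P$ and $Q$ is

$$
\mathrm{MMD}_k(P, Q) \;=\; \|\mu_P - \mu_Q\|_{\mathcal{H}_k} \;=\; \sup_{\|f\|_{\mathcal{H}_k} \le 1} \left| \int f \,\mathrm{d}P - \int f \,\mathrm{d}Q \right|,
$$

and the KSD is the MMD induced by a Stein kernel $k_p$ built from the score $\nabla \log p$ of the target, for which $\mu_P = 0$, so the discrepancy can be estimated from samples of $Q$ alone without sampling from $P$. In this language, property (i) in the abstract asks that the discrepancy vanish only when $Q = P$, and property (ii) asks that $\mathrm{MMD}_k(P, Q_n) \to 0$ imply $Q_n \Rightarrow P$ (weak convergence).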
