Paper title
Lower bounds on the rate of convergence for accept-reject-based Markov chains in Wasserstein and total variation distances
Paper authors
Paper abstract
To avoid poor empirical performance in Metropolis-Hastings and other accept-reject-based algorithms, practitioners often tune them by trial and error. Lower bounds on the convergence rate are developed in both total variation and Wasserstein distances in order to identify how the simulations will fail so these settings can be avoided, providing guidance on tuning. Particular attention is paid to using the lower bounds to study the convergence complexity of accept-reject-based Markov chains and to constrain the rate of convergence for geometrically ergodic Markov chains. The theory is applied in several settings. For example, if the target density concentrates with a parameter n (e.g. posterior concentration, Laplace approximations), it is demonstrated that the convergence rate of a Metropolis-Hastings chain can be arbitrarily slow if the tuning parameters do not depend carefully on n. This is demonstrated with Bayesian logistic regression with Zellner's g-prior when the dimension and sample size increase together, and with flat-prior Bayesian logistic regression as n tends to infinity.
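As a rough illustration of the tuning issue the abstract describes (a minimal sketch, not code from the paper): a random-walk Metropolis sampler in Python/NumPy targeting a one-dimensional density that concentrates with n, here pi_n(x) proportional to exp(-n x^2 / 2). The function name, step sizes, and iteration counts are hypothetical choices for the example; the point is that a step size fixed in n sends nearly every proposal outside the concentrated mass and is rejected, while a step scaled like 1/sqrt(n) keeps acceptance stable as n grows.

```python
# Illustrative sketch only (assumed setup, not from the paper):
# random-walk Metropolis for the concentrating target pi_n(x) ∝ exp(-n x^2 / 2),
# whose mass lives on a scale of roughly 1/sqrt(n) around 0.
import numpy as np

def rw_metropolis(n, step, iters=5000, seed=None):
    """Run random-walk Metropolis; return the chain and its acceptance rate."""
    rng = np.random.default_rng(seed)
    x = 1.0  # start away from the concentration point 0
    accepts = 0
    xs = np.empty(iters)
    for t in range(iters):
        y = x + step * rng.standard_normal()
        # log acceptance ratio for the Gaussian-like target exp(-n x^2 / 2)
        log_alpha = -0.5 * n * (y**2 - x**2)
        if np.log(rng.uniform()) < log_alpha:
            x, accepts = y, accepts + 1
        xs[t] = x
    return xs, accepts / iters

if __name__ == "__main__":
    for n in (10, 1000, 100000):
        # tuning that ignores n: fixed step size
        _, acc_fixed = rw_metropolis(n, step=1.0, seed=0)
        # tuning that tracks the target's scale: step shrinks like 1/sqrt(n)
        _, acc_scaled = rw_metropolis(n, step=2.4 / np.sqrt(n), seed=0)
        print(f"n={n:>6}  fixed-step acceptance={acc_fixed:.3f}  "
              f"scaled-step acceptance={acc_scaled:.3f}")
```

Running this, the fixed-step acceptance rate collapses toward zero as n increases while the scaled-step rate stays moderate, which is the qualitative behavior behind the abstract's statement that convergence can be arbitrarily slow when tuning parameters do not depend carefully on n.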