Title
Equivariant Flows: Exact Likelihood Generative Learning for Symmetric Densities
Authors
Abstract
Normalizing flows are exact-likelihood generative neural networks that approximately transform samples from a simple prior distribution into samples of the probability distribution of interest. Recent work showed that such generative models can be utilized in statistical mechanics to sample equilibrium states of many-body systems in physics and chemistry. To scale and generalize these results, it is essential that the natural symmetries of the probability density -- in physics defined by the invariances of the target potential -- are built into the flow. We provide a theoretical sufficient criterion showing that the distribution generated by equivariant normalizing flows is invariant with respect to these symmetries by design. Furthermore, we propose building blocks for flows that preserve the symmetries typically found in physical and chemical many-body particle systems. Using benchmark systems motivated by molecular physics, we demonstrate that these symmetry-preserving flows can provide better generalization capabilities and sampling efficiency.
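The sufficient criterion mentioned above can be sketched with a standard change-of-variables argument. The notation below (prior $p_Z$, flow $f$, group $G$) is our own and is only an illustration of the claimed result, under the assumption that $G$ acts by linear isometries (e.g. rotations and permutations), so that the action does not change Jacobian determinants:

```latex
% Assume: prior p_Z invariant under G, i.e. p_Z(g \cdot z) = p_Z(z),
% flow f equivariant, i.e. f(g \cdot z) = g \cdot f(z),
% and G acting by linear isometries, so |\det g| = 1.
% Push-forward density via change of variables:
p_X(x) = p_Z\!\left(f^{-1}(x)\right)\,
         \left|\det J_{f^{-1}}(x)\right|.
% Then for any g \in G:
p_X(g \cdot x)
  = p_Z\!\left(f^{-1}(g \cdot x)\right)\,
    \left|\det J_{f^{-1}}(g \cdot x)\right|
  = p_Z\!\left(g \cdot f^{-1}(x)\right)\,
    \left|\det J_{f^{-1}}(x)\right|
  = p_X(x),
% i.e. the generated density is invariant by design.
```

The middle step uses equivariance of $f^{-1}$ (which follows from equivariance of $f$) together with the isometry assumption on the group action.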