Paper Title


Exponential Tilting of Generative Models: Improving Sample Quality by Training and Sampling from Latent Energy

Authors

Zhisheng Xiao, Qing Yan, Yali Amit

Abstract


In this paper, we present a general method that can improve the sample quality of pre-trained likelihood-based generative models. Our method constructs an energy function on the latent variable space, which induces an energy function on samples produced by the pre-trained generative model. The energy-based model is efficiently trained by maximizing the data likelihood, and after training, new samples in the latent space are generated from the energy-based model and passed through the generator to produce samples in observation space. We show that using our proposed method, we can greatly improve the sample quality of popular likelihood-based generative models, such as normalizing flows and VAEs, with very little computational overhead.
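The sampling procedure described in the abstract — draw a latent sample from the trained latent energy-based model, then decode it with the pre-trained generator — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the quadratic `energy` and `tanh` `generator` are hypothetical stand-ins for a learned latent EBM and a pre-trained flow/VAE decoder, and sampling uses unadjusted Langevin dynamics, a common choice for EBM sampling.

```python
import numpy as np

# Hypothetical stand-ins for the paper's learned components:
# E(z) would be a trained latent energy network; here it is a simple quadratic.
def energy_grad(z):
    return z  # gradient of E(z) = 0.5 * ||z||^2

def generator(z):
    # Placeholder decoder; a pre-trained normalizing flow or VAE decoder
    # would map latent z to observation space here.
    return np.tanh(z)

def langevin_sample(z0, step=0.01, n_steps=100, seed=None):
    """Approximately sample from the latent EBM p(z) ∝ exp(-E(z))
    using unadjusted Langevin dynamics."""
    rng = np.random.default_rng(seed)
    z = z0.copy()
    for _ in range(n_steps):
        noise = rng.standard_normal(z.shape)
        z = z - step * energy_grad(z) + np.sqrt(2.0 * step) * noise
    return z

rng = np.random.default_rng(0)
z_init = rng.standard_normal(8)                 # initial latent from the base prior
z_tilted = langevin_sample(z_init, seed=1)      # latent refined by the EBM
x = generator(z_tilted)                         # decoded observation-space sample
```

Because the energy model lives in the low-dimensional latent space and the generator is frozen, the only sampling cost beyond the original model is the Langevin loop, which is consistent with the abstract's claim of very little computational overhead.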
