Paper Title
Generative models with kernel distance in data space
Paper Authors
Paper Abstract
Generative models dealing with modeling a joint data distribution are generally either autoencoder based or GAN based. Both have their pros and cons: the former tend to generate blurry images, while the latter are unstable in training and prone to the mode collapse phenomenon. The objective of this paper is to construct a model situated between the above architectures, one that does not inherit their main weaknesses. The proposed LCW generator (Latent Cramer-Wold generator) resembles a classical GAN in transforming Gaussian noise into data space. Most importantly, instead of a discriminator, the LCW generator uses a kernel distance. No adversarial training is utilized, hence the name generator. It is trained in two phases. First, an autoencoder-based architecture, using kernel measures, is built to model the manifold of the data. We then propose a Latent Trick, mapping a Gaussian to the latent space, in order to obtain the final model. This results in very competitive FID values.
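
The abstract only outlines the two-phase training scheme, so the following is a minimal, hypothetical sketch of how such a pipeline could be wired up in PyTorch. It is not the authors' implementation: the network sizes, the placeholder data, and the use of a Gaussian-kernel MMD as a generic stand-in for the Cramer-Wold distance are all assumptions, as is the exact form of the phase-2 objective (here taken to compare decoded samples with real data in data space, following the paper's title).

```python
import torch
import torch.nn as nn

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise RBF kernel matrix between rows of x and y.
    return torch.exp(-torch.cdist(x, y).pow(2) / (2 * sigma ** 2))

def kernel_distance(x, y, sigma=1.0):
    # Squared MMD with a Gaussian kernel -- a generic stand-in for the
    # Cramer-Wold distance described in the paper (assumption).
    return (gaussian_kernel(x, x, sigma).mean()
            + gaussian_kernel(y, y, sigma).mean()
            - 2.0 * gaussian_kernel(x, y, sigma).mean())

data_dim, latent_dim, noise_dim, batch = 784, 8, 8, 64
encoder = nn.Sequential(nn.Linear(data_dim, 256), nn.ReLU(), nn.Linear(256, latent_dim))
decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, data_dim))
latent_generator = nn.Sequential(nn.Linear(noise_dim, 256), nn.ReLU(), nn.Linear(256, latent_dim))
data = torch.rand(1024, data_dim)  # placeholder data; replace with real images

# Phase 1: kernel-regularized autoencoder models the data manifold;
# the latent code is pulled towards a standard Gaussian prior.
opt_ae = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
for step in range(100):
    x = data[torch.randint(0, len(data), (batch,))]
    z = encoder(x)
    loss = (decoder(z) - x).pow(2).mean() + kernel_distance(z, torch.randn_like(z))
    opt_ae.zero_grad(); loss.backward(); opt_ae.step()

# Phase 2 ("Latent Trick"): freeze the autoencoder and train a noise-to-latent map
# so that decoded samples match real data under the kernel distance in data space.
for p in list(encoder.parameters()) + list(decoder.parameters()):
    p.requires_grad_(False)
opt_gen = torch.optim.Adam(latent_generator.parameters(), lr=1e-3)
for step in range(100):
    x = data[torch.randint(0, len(data), (batch,))]
    fake = decoder(latent_generator(torch.randn(batch, noise_dim)))
    loss = kernel_distance(fake, x)
    opt_gen.zero_grad(); loss.backward(); opt_gen.step()

# Sampling: decode latents produced from Gaussian noise, as in a classical GAN generator.
samples = decoder(latent_generator(torch.randn(16, noise_dim)))
```

Note that, as in the abstract, no discriminator or adversarial objective appears anywhere: both phases minimize a closed-form kernel distance, which is what distinguishes this setup from a standard GAN.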