Title
Batch norm with entropic regularization turns deterministic autoencoders into generative models
Authors
Abstract
The variational autoencoder is a well-defined deep generative model built on an encoder-decoder framework, in which an encoding neural network outputs a non-deterministic code used to reconstruct the input. The encoder achieves this by sampling from a distribution for each input rather than emitting a deterministic code per input. The great advantage of this process is that the network can then be used as a generative model, drawing new samples from the data distribution underlying the training examples. We show in this work that using batch normalization as the source of non-determinism suffices to turn deterministic autoencoders into generative models on par with variational ones, so long as a suitable entropic regularization is added to the training objective.
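As a rough illustration of the idea (a sketch, not the paper's actual implementation), the following NumPy snippet batch-normalizes a deterministic latent code and adds an entropic regularizer to the reconstruction loss. The linear encoder/decoder, the Gaussian negative log-likelihood used as an entropy surrogate (reasonable once batch norm fixes the code's mean and variance, since the Gaussian maximizes entropy under those constraints), and the weight `lam` are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_norm(z, eps=1e-5):
    # Normalize each latent dimension across the batch: zero mean, unit variance.
    # This is the sole source of non-determinism: a code depends on its batch.
    return (z - z.mean(axis=0)) / np.sqrt(z.var(axis=0) + eps)

def objective(x, W_enc, W_dec, lam=1.0):
    # Deterministic linear encoder/decoder (hypothetical stand-ins for networks).
    z = batch_norm(x @ W_enc)
    x_hat = z @ W_dec
    recon = np.mean((x - x_hat) ** 2)
    # Entropic regularizer (assumed form): Gaussian NLL of the normalized codes,
    # pushing them toward a distribution we can later sample from.
    nll = np.mean(0.5 * z ** 2 + 0.5 * np.log(2 * np.pi))
    return recon + lam * nll

x = rng.normal(size=(64, 8))            # a batch of 64 inputs of dimension 8
W_enc = 0.1 * rng.normal(size=(8, 4))   # encode to a 4-dimensional latent
W_dec = 0.1 * rng.normal(size=(4, 8))
loss = objective(x, W_enc, W_dec)
print(loss)
```

At generation time, one would sample codes from the target latent distribution (here a standard normal) and pass them through the decoder, mirroring how a variational autoencoder is used.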