Paper Title

Improving Variational Autoencoder for Text Modelling with Timestep-Wise Regularisation

Authors

Ruizhe Li, Xiao Li, Guanyi Chen, Chenghua Lin

Abstract

The Variational Autoencoder (VAE) is a popular and powerful model applied to text modelling to generate diverse sentences. However, an issue known as posterior collapse (or KL loss vanishing) happens when the VAE is used in text modelling, where the approximate posterior collapses to the prior, and the model will totally ignore the latent variables and be degraded to a plain language model during text generation. Such an issue is particularly prevalent when RNN-based VAE models are employed for text modelling. In this paper, we propose a simple, generic architecture called Timestep-Wise Regularisation VAE (TWR-VAE), which can effectively avoid posterior collapse and can be applied to any RNN-based VAE models. The effectiveness and versatility of our model are demonstrated in different tasks, including language modelling and dialogue response generation.
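The core idea described above can be illustrated with a small sketch: instead of imposing a single KL term on the latent variable inferred from the final RNN hidden state, a KL divergence against the standard-normal prior is accumulated at every timestep. This is a minimal illustrative sketch of the timestep-wise KL computation only, not the authors' implementation; the function names and toy values below are assumptions.

```python
import math

def kl_standard_normal(mu, logvar):
    """KL( N(mu, exp(logvar)) || N(0, 1) ), summed over latent dimensions."""
    return 0.5 * sum(math.exp(lv) + m * m - 1.0 - lv
                     for m, lv in zip(mu, logvar))

def timestep_wise_kl(mus, logvars):
    """Sum the Gaussian KL term over every RNN timestep, not only the last.

    mus[t] and logvars[t] are the posterior parameters inferred from the
    RNN hidden state at timestep t (hypothetical per-timestep inference).
    """
    return sum(kl_standard_normal(m, lv) for m, lv in zip(mus, logvars))

# Toy run: 3 timesteps, 2-dimensional latent, unit variance throughout.
mus = [[0.0, 0.0], [0.5, -0.5], [1.0, 1.0]]
logvars = [[0.0, 0.0]] * 3
total_kl = timestep_wise_kl(mus, logvars)  # 0.0 + 0.25 + 1.0 = 1.25
```

In training, this `total_kl` would be added to the reconstruction loss, so the regularisation signal reaches every timestep of the encoder rather than collapsing onto a single latent bottleneck.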
