Paper Title
CoCon: A Self-Supervised Approach for Controlled Text Generation
Paper Authors
Paper Abstract
Pretrained Transformer-based language models (LMs) display remarkable natural language generation capabilities. With their immense potential, controlling text generation of such LMs is getting attention. While there are studies that seek to control high-level attributes (such as sentiment and topic) of generated text, there is still a lack of more precise control over its content at the word- and phrase-level. Here, we propose Content-Conditioner (CoCon) to control an LM's output text with a content input, at a fine-grained level. In our self-supervised approach, the CoCon block learns to help the LM complete a partially-observed text sequence by conditioning with content inputs that are withheld from the LM. Through experiments, we show that CoCon can naturally incorporate target content into generated texts and control high-level text attributes in a zero-shot manner.
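To make the self-supervised recipe in the abstract concrete, below is a minimal, hypothetical PyTorch sketch: a text sequence is split into an observed prefix and a withheld continuation, the continuation doubles as the content input c, and a CoCon-style block fuses c into the LM's intermediate hidden states via cross-attention. All names here (`CoConBlock`, `self_supervised_step`, `lm_tail`, the exact fusion mechanism, the loss slicing) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CoConBlock(nn.Module):
    """One plausible 'content conditioner': the text's hidden states
    attend over the content input's hidden states (an assumption, not
    necessarily the paper's exact architecture)."""
    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, h_text: torch.Tensor, h_content: torch.Tensor) -> torch.Tensor:
        fused, _ = self.attn(h_text, h_content, h_content)
        return self.norm(h_text + fused)

def self_supervised_step(embed, cocon, lm_tail, lm_head, tokens, split):
    """One simplified training step: the continuation x^b = tokens[:, split:]
    doubles as the content input c, and the conditioned LM is trained
    (teacher-forced) to reconstruct it from the prefix x^a and c."""
    content = tokens[:, split:]                 # c = x^b, hidden from the plain LM objective
    inputs, targets = tokens[:, :-1], tokens[:, 1:]
    h = embed(inputs)                           # hidden states of the text sequence
    h = cocon(h, embed(content))                # fuse the content input into the LM
    h = lm_tail(h)                              # remaining (pretrained) LM layers
    logits = lm_head(h)
    # Score only the withheld span: reconstruct x^b given x^a and c.
    loss = F.cross_entropy(
        logits[:, split - 1:].reshape(-1, logits.size(-1)),
        targets[:, split - 1:].reshape(-1),
    )
    return loss

# Toy usage with random layers standing in for a real pretrained LM:
d, vocab = 64, 100
embed = nn.Embedding(vocab, d)
cocon = CoConBlock(d)
lm_tail = nn.Sequential(nn.Linear(d, d), nn.GELU())
lm_head = nn.Linear(d, vocab)
tokens = torch.randint(0, vocab, (2, 16))
loss = self_supervised_step(embed, cocon, lm_tail, lm_head, tokens, split=8)
loss.backward()
```

A natural design choice, consistent with the abstract's framing of CoCon as a plug-in block for a pretrained LM, is to freeze the LM's own parameters and train only the CoCon block on this reconstruction loss; in the sketch above that would amount to setting `requires_grad_(False)` on `embed`, `lm_tail`, and `lm_head`.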