Paper Title
Modern Methods for Text Generation
Paper Authors
Paper Abstract
Synthetic text generation is challenging and has seen only limited success. Recently, a new architecture, called the Transformer, has allowed machine learning models to better understand sequential data, improving tasks such as translation and summarization. BERT and GPT-2, which use Transformers at their core, have shown great performance in tasks such as text classification, translation, and natural language inference (NLI). In this article, we analyse both algorithms and compare their output quality in text generation tasks.
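The abstract does not describe an implementation, but as context, a minimal sketch of GPT-2-based text generation using the Hugging Face `transformers` library (an assumed toolchain; the model name and sampling parameters below are illustrative, not taken from the paper) could look like this:

```python
# Minimal sketch (assumption: Hugging Face `transformers` is the toolchain;
# the paper does not specify one). Generates a continuation of a prompt
# with the pretrained GPT-2 language model.
from transformers import pipeline

# Load a text-generation pipeline backed by the public "gpt2" checkpoint.
generator = pipeline("text-generation", model="gpt2")

result = generator(
    "Synthetic text generation is",
    max_length=50,   # cap the total length of the generated sequence
    do_sample=True,  # sample from the distribution instead of greedy decoding
    top_k=50,        # restrict sampling to the 50 most likely next tokens
)
print(result[0]["generated_text"])
```

Sampling (rather than greedy decoding) is a common choice for open-ended generation, since it trades determinism for more varied and natural-sounding output.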