Paper Title
mT5: A massively multilingual pre-trained text-to-text transformer
Paper Authors
Paper Abstract
The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified text-to-text format and scale to attain state-of-the-art results on a wide variety of English-language NLP tasks. In this paper, we introduce mT5, a multilingual variant of T5 that was pre-trained on a new Common Crawl-based dataset covering 101 languages. We detail the design and modified training of mT5 and demonstrate its state-of-the-art performance on many multilingual benchmarks. We also describe a simple technique to prevent "accidental translation" in the zero-shot setting, where a generative model chooses to (partially) translate its prediction into the wrong language. All of the code and model checkpoints used in this work are publicly available.
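The abstract notes that the mT5 code and model checkpoints are publicly available. As a minimal sketch (not part of the paper itself), the released checkpoints can be loaded through the Hugging Face Transformers library; the model name "google/mt5-small" and the Transformers API are assumptions here, and the official release also provides Mesh TensorFlow/T5X checkpoints.

```python
# Minimal sketch: loading a released mT5 checkpoint via Hugging Face Transformers.
# Assumes the "google/mt5-small" hub checkpoint; the paper's own codebase is separate.
from transformers import AutoTokenizer, MT5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

# mT5 is pre-trained only with the unsupervised span-corruption objective
# (no supervised task mixture), so it should be fine-tuned on a downstream
# text-to-text task before its generations are useful. The call below only
# illustrates the unified text-to-text interface.
inputs = tokenizer("Translate to German: The house is small.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the pre-trained checkpoint has not been fine-tuned, the printed output is not expected to be a meaningful translation; the snippet only shows how the checkpoints plug into the same text-to-text format described in the abstract.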