Paper Title

Knowledge Prompts: Injecting World Knowledge into Language Models through Soft Prompts

Authors

Cicero Nogueira dos Santos, Zhe Dong, Daniel Cer, John Nham, Siamak Shakeri, Jianmo Ni, Yun-hsuan Sung

Abstract

Soft prompts have been recently proposed as a tool for adapting large frozen language models (LMs) to new tasks. In this work, we repurpose soft prompts to the task of injecting world knowledge into LMs. We introduce a method to train soft prompts via self-supervised learning on data from knowledge bases. The resulting soft knowledge prompts (KPs) are task independent and work as an external memory of the LMs. We perform qualitative and quantitative experiments and demonstrate that: (1) KPs can effectively model the structure of the training data; (2) KPs can be used to improve the performance of LMs in different knowledge intensive tasks.
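The core mechanism the abstract describes, a frozen LM conditioned on a small set of trainable prompt vectors that act as an external memory, can be sketched as follows. This is an illustrative sketch only; all names, sizes, and the toy update step are our assumptions, not the paper's implementation.

```python
import numpy as np

# Illustrative sketch (not the paper's code): a soft knowledge prompt is a
# small matrix of trainable vectors prepended to the token embeddings of a
# frozen language model. Only the prompt receives gradient updates.

rng = np.random.default_rng(0)

d_model = 8      # embedding width (toy size)
prompt_len = 4   # number of soft-prompt vectors
seq_len = 6      # number of input tokens

# Frozen side: token embeddings produced by the pretrained LM (never updated).
frozen_token_embeddings = rng.normal(size=(seq_len, d_model))

# Trainable side: the soft knowledge prompt (the external "memory").
knowledge_prompt = rng.normal(size=(prompt_len, d_model))

def build_model_input(kp, token_emb):
    """Prepend the soft-prompt vectors to the token embeddings."""
    return np.concatenate([kp, token_emb], axis=0)

x = build_model_input(knowledge_prompt, frozen_token_embeddings)
print(x.shape)  # (prompt_len + seq_len, d_model) -> (10, 8)

# Toy SGD step: in training, some loss gradient g flows only into the
# prompt; the frozen LM parameters are left untouched.
g = rng.normal(size=knowledge_prompt.shape)
knowledge_prompt = knowledge_prompt - 0.1 * g
```

The point of the sketch is the asymmetry: the concatenated input conditions the frozen model on the prompt, while training touches only the `prompt_len * d_model` prompt parameters, which is what makes the resulting KPs cheap, task-independent, and swappable like an external memory.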
