Paper Title
CKG: Dynamic Representation Based on Context and Knowledge Graph
Paper Authors
Paper Abstract
Recently, neural language representation models pre-trained on large corpora can capture rich co-occurrence information and be fine-tuned on downstream tasks to improve performance. As a result, they have achieved state-of-the-art results on a wide range of language tasks. However, external knowledge graphs (KGs) contain other valuable semantic information, such as similar, opposite, or other possible meanings. We argue that entities in KGs can be used to enhance the correct semantic meaning of language sentences. In this paper, we propose a new method, CKG: Dynamic Representation Based on \textbf{C}ontext and \textbf{K}nowledge \textbf{G}raph. On the one hand, CKG can extract rich semantic information from large corpora. On the other hand, it can make full use of internal information, such as co-occurrence in large corpora, and external information, such as similar entities in KGs. We conduct extensive experiments on a wide range of tasks, including QQP, MRPC, SST-5, SQuAD, CoNLL 2003, and SNLI. The experimental results show that CKG achieves a state-of-the-art score of 89.2 on SQuAD, compared with SAN (84.4), ELMo (85.8), and BERT$_{Base}$ (88.5).
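The abstract describes the method only at a high level: contextual representations from a pre-trained language model are combined with entity information from a KG. The sketch below is one minimal, hypothetical reading of such a fusion (PyTorch); the module names, dimensions, entity-alignment scheme, and the additive fusion rule are our assumptions for illustration, not the paper's actual CKG architecture.

```python
# Hypothetical sketch of context/KG fusion. The paper's actual CKG
# architecture is not specified in this abstract, so every name,
# dimension, and the fusion rule below is an illustrative assumption.
import torch
import torch.nn as nn


class ContextKGFusion(nn.Module):
    """Fuse contextual token embeddings with aligned KG entity embeddings.

    Assumes each token has already been linked to at most one KG entity;
    unlinked tokens would use a zero entity vector (our assumption).
    """

    def __init__(self, hidden_dim: int, entity_dim: int):
        super().__init__()
        # Project both modalities into a shared hidden space.
        self.token_proj = nn.Linear(hidden_dim, hidden_dim)
        self.entity_proj = nn.Linear(entity_dim, hidden_dim)
        self.activation = nn.GELU()

    def forward(self, token_states: torch.Tensor,
                entity_states: torch.Tensor) -> torch.Tensor:
        # token_states:  (batch, seq_len, hidden_dim) from a pre-trained LM
        # entity_states: (batch, seq_len, entity_dim) from KG embeddings
        fused = self.token_proj(token_states) + self.entity_proj(entity_states)
        return self.activation(fused)


if __name__ == "__main__":
    batch, seq_len, hidden_dim, entity_dim = 2, 16, 768, 100
    fusion = ContextKGFusion(hidden_dim, entity_dim)
    tokens = torch.randn(batch, seq_len, hidden_dim)    # e.g. BERT outputs
    entities = torch.randn(batch, seq_len, entity_dim)  # e.g. TransE vectors
    print(fusion(tokens, entities).shape)  # torch.Size([2, 16, 768])
```

Additive projection-then-activation is only one common fusion choice (used, for example, in ERNIE-style token-entity aggregation); the paper may use a different mechanism, such as attention over candidate entities.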