Paper Title

Distributed Representations of Entities in Open-World Knowledge Graphs

Authors

Lingbing Guo, Zhuo Chen, Jiaoyan Chen, Yichi Zhang, Zequn Sun, Zhongpo Bo, Yin Fang, Xiaoze Liu, Huajun Chen, Wen Zhang

Abstract

Graph neural network (GNN)-based methods have demonstrated remarkable performance in various knowledge graph (KG) tasks. However, most existing approaches rely on observing all entities during training, posing a challenge in real-world knowledge graphs where new entities emerge frequently. To address this limitation, we introduce Decentralized Attention Network (DAN). DAN leverages neighbor context as the query vector to score the neighbors of an entity, thereby distributing the entity semantics only among its neighbor embeddings. To effectively train a DAN, we introduce self-distillation, a technique that guides the network in generating desired representations. Theoretical analysis validates the effectiveness of our approach. We implement an end-to-end framework and conduct extensive experiments to evaluate our method, showcasing competitive performance on conventional entity alignment and entity prediction tasks. Furthermore, our method significantly outperforms existing methods in open-world settings.
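The abstract's core idea, scoring an entity's neighbors with a query built from the neighbor context itself so that an unseen entity needs no embedding of its own, can be sketched as follows. This is a minimal illustration assuming mean-pooled neighbor context and dot-product attention; the paper's actual DAN architecture may differ.

```python
import numpy as np

def decentralized_attention(neighbor_embs: np.ndarray) -> np.ndarray:
    """Aggregate an entity representation purely from neighbor embeddings.

    Hypothetical sketch: the attention query is the neighbor context
    (here, the mean of neighbor embeddings) rather than the entity's own
    embedding, so a new entity unseen during training can still be
    represented as long as its neighbors are known.
    """
    # Neighbor context serves as the query vector.
    query = neighbor_embs.mean(axis=0)            # shape (d,)
    # Score each neighbor against the context query.
    scores = neighbor_embs @ query                # shape (n,)
    # Numerically stable softmax over neighbor scores.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Entity representation: attention-weighted sum of neighbors.
    return weights @ neighbor_embs                # shape (d,)

# Example: an emerging entity with three observed neighbors.
rng = np.random.default_rng(0)
neighbors = rng.normal(size=(3, 8))
rep = decentralized_attention(neighbors)
```

Because the query never involves the target entity's own embedding, the semantics are distributed only among neighbor embeddings, which is what makes the open-world (new-entity) setting tractable.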
