Paper Title

ReFactor GNNs: Revisiting Factorisation-based Models from a Message-Passing Perspective

Authors

Yihong Chen, Pushkar Mishra, Luca Franceschi, Pasquale Minervini, Pontus Stenetorp, Sebastian Riedel

Abstract

Factorisation-based Models (FMs), such as DistMult, have enjoyed enduring success for Knowledge Graph Completion (KGC) tasks, often outperforming Graph Neural Networks (GNNs). However, unlike GNNs, FMs struggle to incorporate node features and generalise to unseen nodes in inductive settings. Our work bridges the gap between FMs and GNNs by proposing ReFactor GNNs. This new architecture draws upon both modelling paradigms, which previously were largely thought of as disjoint. Concretely, using a message-passing formalism, we show how FMs can be cast as GNNs by reformulating the gradient descent procedure as message-passing operations, which forms the basis of our ReFactor GNNs. Across a multitude of well-established KGC benchmarks, our ReFactor GNNs achieve comparable transductive performance to FMs, and state-of-the-art inductive performance while using an order of magnitude fewer parameters.
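The core observation, that a gradient-descent step on FM node embeddings decomposes into per-edge messages aggregated per node, can be made concrete with a small sketch. The following is a minimal illustration, not the authors' implementation: the learning rate, embedding sizes, example triples, and the simplified loss (maximising the summed score over observed triples, with no negative sampling or regularisation) are all assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)
num_nodes, num_rels, dim = 5, 2, 4
E = rng.normal(size=(num_nodes, dim))  # node (entity) embeddings
R = rng.normal(size=(num_rels, dim))   # relation embeddings (DistMult uses a diagonal relation matrix)

# Observed knowledge-graph triples (subject, relation, object); illustrative only.
triples = [(0, 0, 1), (0, 1, 2), (3, 0, 0)]

def total_score():
    # DistMult score <e_s, w_r, e_o> = sum_k E[s,k] * R[r,k] * E[o,k],
    # summed over the observed triples.
    return sum(np.sum(E[s] * R[r] * E[o]) for s, r, o in triples)

# One gradient-ascent step on the summed score. The gradient w.r.t. a node
# embedding decomposes over its incident edges: edge (s, r, o) contributes
# the "message" R[r] * E[o] to node s and R[r] * E[s] to node o. Summing
# these per node and applying the update is a round of message passing over
# the graph -- the correspondence the paper formalises.
lr = 0.01
messages = np.zeros_like(E)
for s, r, o in triples:
    messages[s] += R[r] * E[o]  # message flowing to the subject node
    messages[o] += R[r] * E[s]  # message flowing to the object node

before = total_score()
E = E + lr * messages            # node update = aggregation of edge messages
print(before, total_score())     # summed score increases for a small step size
```

In this view each gradient step is one message-passing layer; the full ReFactor GNN additionally incorporates node features and extends to unseen nodes in inductive settings, as described in the abstract.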
