Paper Title

Neural Graph Matching for Pre-training Graph Neural Networks

Paper Authors

Yupeng Hou, Binbin Hu, Wayne Xin Zhao, Zhiqiang Zhang, Jun Zhou, Ji-Rong Wen

Abstract

Recently, graph neural networks (GNNs) have shown powerful capacity in modeling structural data. However, adapting them to downstream tasks usually requires abundant task-specific labeled data, which can be extremely scarce in practice. A promising solution to this data scarcity is to pre-train a transferable and expressive GNN model on large amounts of unlabeled or coarse-grained labeled graphs, and then fine-tune the pre-trained GNN on downstream datasets with task-specific fine-grained labels. In this paper, we present a novel Graph Matching based GNN Pre-Training framework, called GMPT. Focusing on a pair of graphs, we propose to learn structural correspondences between them via neural graph matching, consisting of both intra-graph message passing and inter-graph message passing. In this way, we can learn adaptive representations for a given graph when paired with different graphs, and both node- and graph-level characteristics are naturally considered in a single pre-training task. The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training. We further propose an approximate contrastive training strategy to significantly reduce time/memory consumption. Extensive experiments on multi-domain, out-of-distribution benchmarks have demonstrated the effectiveness of our approach. The code is available at: https://github.com/RUCAIBox/GMPT.
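To make the core idea concrete, below is a minimal sketch in plain PyTorch of the neural graph matching pattern the abstract describes: one round of intra-graph message passing plus inter-graph message passing via soft cross-graph attention, compared through pooled graph embeddings. This is an illustrative approximation under simplifying assumptions (dense adjacency matrices, shared weights across the pair), not the GMPT implementation; all names such as MatchingLayer and matching_score are hypothetical and do not come from the GMPT codebase.

import torch
import torch.nn as nn


class MatchingLayer(nn.Module):
    """One round of intra-graph + inter-graph message passing (illustrative)."""

    def __init__(self, dim):
        super().__init__()
        self.intra = nn.Linear(dim, dim)        # neighbor aggregation within a graph
        self.update = nn.GRUCell(2 * dim, dim)  # fuses intra- and inter-graph messages

    def forward(self, h1, adj1, h2, adj2):
        # Intra-graph message passing: mean-aggregate each node's neighbors.
        m1 = self.intra(adj1 @ h1 / adj1.sum(-1, keepdim=True).clamp(min=1))
        m2 = self.intra(adj2 @ h2 / adj2.sum(-1, keepdim=True).clamp(min=1))
        # Inter-graph message passing: every node attends over the *other*
        # graph's nodes, so its representation adapts to the paired graph.
        scores = h1 @ h2.T                          # (n1, n2) soft correspondence scores
        c1 = torch.softmax(scores, dim=1) @ h2      # messages from graph 2 to graph 1
        c2 = torch.softmax(scores.T, dim=1) @ h1    # messages from graph 1 to graph 2
        h1 = self.update(torch.cat([m1, c1], dim=-1), h1)
        h2 = self.update(torch.cat([m2, c2], dim=-1), h2)
        return h1, h2


def matching_score(layer, h1, adj1, h2, adj2, rounds=3):
    # Run a few matching rounds, then compare mean-pooled graph embeddings.
    # A contrastive objective would pull scores of positive pairs up and
    # scores of negative pairs down.
    for _ in range(rounds):
        h1, h2 = layer(h1, adj1, h2, adj2)
    g1, g2 = h1.mean(0), h2.mean(0)
    return torch.cosine_similarity(g1, g2, dim=0)


# Toy usage on two small random graphs (5 and 7 nodes, 16-dim features).
dim = 16
layer = MatchingLayer(dim)
h1, h2 = torch.randn(5, dim), torch.randn(7, dim)
adj1 = (torch.rand(5, 5) > 0.5).float()
adj2 = (torch.rand(7, 7) > 0.5).float()
print(matching_score(layer, h1, adj1, h2, adj2))

The cross-graph attention step is what makes the learned representations adaptive: the same graph yields different node embeddings depending on which graph it is paired with, which is the property the abstract highlights.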
