Paper Title


NDGGNET-A Node Independent Gate based Graph Neural Networks

Paper Authors

Ye Tang, Xuesong Yang, Xinrui Liu, Xiwei Zhao, Zhangang Lin, Changping Peng

Paper Abstract


Graph Neural Networks (GNNs) are an architecture for structured data and have been adopted in a wide range of tasks with excellent results, such as link prediction, node classification, graph classification, and so on. Generally, for a certain node in a given graph, a traditional GNN layer can be regarded as an aggregation from one-hop neighbors, so a set of stacked layers is able to fetch and update node states within multiple hops. For nodes with sparse connectivity, it is difficult to obtain enough information through a single GNN layer, not only because few nodes are directly connected to them but also because high-order neighbor information cannot be propagated. However, as the number of layers increases, the GNN model is prone to over-smoothing for nodes with dense connectivity, which results in a decrease in accuracy. To tackle this issue, in this paper we define a novel framework that allows a normal GNN model to accommodate more layers. Specifically, a node-degree based gate is employed to adjust the weights of layers dynamically, which aims to enhance the information aggregation ability and reduce the probability of over-smoothing. Experimental results show that our proposed model can effectively increase the model depth and perform well on several datasets.
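The gating mechanism described in the abstract can be pictured as a per-node blend between a layer's fresh one-hop aggregation and the node's previous state, with the blend weight computed from the node's degree. Below is a minimal PyTorch sketch under that reading; the class name `DegreeGatedGCNLayer`, the log-degree gate input, and the row-normalized GCN-style aggregation are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of a node-degree based gate wrapped around a plain GCN-style layer,
# assuming dense adjacency matrices. Each node's gate g_v blends the new
# aggregation with its previous state, so the layer can damp updates for
# densely connected nodes while letting sparse nodes absorb deeper information.
# All names and the gate parameterization are illustrative, not the authors'.
import torch
import torch.nn as nn


class DegreeGatedGCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.proj = nn.Linear(in_dim, out_dim) if in_dim != out_dim else nn.Identity()
        # Scalar gate per node, computed from its (log) degree.
        self.gate = nn.Linear(1, 1)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: [N, in_dim] node features; adj: [N, N] dense adjacency without self-loops.
        deg = adj.sum(dim=1, keepdim=True)                      # [N, 1] node degrees
        adj_hat = adj + torch.eye(adj.size(0), device=adj.device)
        norm_adj = adj_hat / adj_hat.sum(dim=1, keepdim=True)   # row-normalized aggregation
        aggregated = torch.relu(self.linear(norm_adj @ x))      # one-hop GCN-style update

        g = torch.sigmoid(self.gate(torch.log1p(deg)))          # [N, 1] degree-based gate
        return g * aggregated + (1.0 - g) * self.proj(x)        # gated residual mixing


if __name__ == "__main__":
    # Toy usage: 5 nodes on a chain graph, 8 input features, two stacked layers.
    x = torch.randn(5, 8)
    adj = torch.zeros(5, 5)
    for i in range(4):
        adj[i, i + 1] = adj[i + 1, i] = 1.0
    layers = nn.ModuleList([DegreeGatedGCNLayer(8, 16), DegreeGatedGCNLayer(16, 16)])
    h = x
    for layer in layers:
        h = layer(h, adj)
    print(h.shape)  # torch.Size([5, 16])
```

Because the gate is a function of degree rather than a fixed hyperparameter, stacking more such layers does not force every node to keep averaging with its neighbors, which is one way to interpret the paper's claim that the framework tolerates greater depth.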
