Beyond Localized Graph Neural Networks: An Attributed Motif
Regularization Framework
- URL: http://arxiv.org/abs/2009.05197v1
- Date: Fri, 11 Sep 2020 02:03:09 GMT
- Title: Beyond Localized Graph Neural Networks: An Attributed Motif
Regularization Framework
- Authors: Aravind Sankar, Junting Wang, Adit Krishnan, Hari Sundaram
- Abstract summary: InfoMotif is a new semi-supervised, motif-regularized learning framework over graphs.
We overcome two key limitations of message passing in graph neural networks (GNNs).
We show significant gains (3-10% accuracy) across six diverse, real-world datasets.
- Score: 6.790281989130923
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present InfoMotif, a new semi-supervised, motif-regularized learning
framework over graphs. We overcome two key limitations of message passing in
popular graph neural networks (GNNs): localization (a k-layer GNN cannot
utilize features outside the k-hop neighborhood of the labeled training nodes)
and over-smoothed (structurally indistinguishable) representations. We propose
the concept of attributed structural roles of nodes based on their occurrence
in different network motifs, independent of network proximity. Two nodes share
attributed structural roles if they participate in topologically similar motif
instances over co-varying sets of attributes. Further, InfoMotif achieves
architecture independence by regularizing the node representations of arbitrary
GNNs via mutual information maximization. Our training curriculum dynamically
prioritizes multiple motifs in the learning process without relying on
distributional assumptions in the underlying graph or the learning task. We
integrate three state-of-the-art GNNs in our framework to show significant
gains (3-10% accuracy) across six diverse, real-world datasets. We see stronger
gains for nodes with sparse training labels and diverse attributes in local
neighborhood structures.
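
To make the regularization idea concrete, below is a minimal sketch of how a motif-based mutual-information term could be attached to an arbitrary GNN's supervised objective. It assumes an InfoNCE-style contrastive lower bound on mutual information and per-motif curriculum weights; the names (`infonce_mi_lower_bound`, `motif_reps_per_motif`, `curriculum_weights`) are illustrative and not taken from the authors' implementation.

```python
# Minimal sketch (not the authors' code): motif-based mutual-information
# regularization added on top of an arbitrary GNN's supervised loss.
# The InfoNCE-style discriminator below is one common MI lower bound;
# the paper's exact estimator, motif set, and curriculum may differ.
import numpy as np

rng = np.random.default_rng(0)

def infonce_mi_lower_bound(node_reps, motif_reps):
    """Contrastive MI estimate between node embeddings and summaries of the
    motif instances they occur in (positives = same node, negatives = other
    nodes in the batch)."""
    scores = node_reps @ motif_reps.T                        # (N, N) similarity matrix
    scores = scores - scores.max(axis=1, keepdims=True)      # numerical stability
    log_softmax = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return log_softmax.diagonal().mean()                     # maximize matched pairs

def infomotif_loss(node_reps, motif_reps_per_motif, supervised_loss,
                   curriculum_weights, reg_strength=0.5):
    """Supervised loss plus a curriculum-weighted MI regularizer per motif type."""
    mi_terms = np.array([infonce_mi_lower_bound(node_reps, m)
                         for m in motif_reps_per_motif])
    # Higher MI is better, so the regularizer enters the loss with a negative sign.
    return supervised_loss - reg_strength * np.dot(curriculum_weights, mi_terms)

# Toy usage: 8 nodes, 16-dim embeddings, 3 motif types (e.g. triangles, stars, cliques).
N, D, M = 8, 16, 3
node_reps = rng.normal(size=(N, D))                          # stand-in for GNN outputs
motif_reps = [rng.normal(size=(N, D)) for _ in range(M)]     # per-motif context summaries
curriculum = np.ones(M) / M                                  # would be re-weighted during training
print(infomotif_loss(node_reps, motif_reps, supervised_loss=1.2,
                     curriculum_weights=curriculum))
```

Because the regularizer only consumes the node representations, it can be bolted onto any GNN encoder without changing its architecture, which is the architecture-independence property the abstract refers to.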
Related papers
- Harnessing Collective Structure Knowledge in Data Augmentation for Graph Neural Networks [25.12261412297796]
Graph neural networks (GNNs) have achieved state-of-the-art performance in graph representation learning.
We propose a novel approach, namely collective structure knowledge-augmented graph neural network (CoS-GNN).
arXiv Detail & Related papers (2024-05-17T08:50:00Z)
- Learning Invariant Representations of Graph Neural Networks via Cluster Generalization [58.68231635082891]
Graph neural networks (GNNs) have become increasingly popular in modeling graph-structured data.
In this paper, we experimentally find that the performance of GNNs drops significantly when a structure shift happens.
We propose the Cluster Information Transfer (CIT) mechanism, which can learn invariant representations for GNNs.
arXiv Detail & Related papers (2024-03-06T10:36:56Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Collaborative Graph Neural Networks for Attributed Network Embedding [63.39495932900291]
Graph neural networks (GNNs) have shown prominent performance on attributed network embedding.
We propose COllaborative graph Neural Networks (CONN), a tailored GNN architecture for network embedding.
arXiv Detail & Related papers (2023-07-22T04:52:27Z)
- A Variational Edge Partition Model for Supervised Graph Representation Learning [51.30365677476971]
This paper introduces a graph generative process to model how the observed edges are generated by aggregating the node interactions over a set of overlapping node communities.
We partition each edge into the summation of multiple community-specific weighted edges and use them to define community-specific GNNs.
A variational inference framework is proposed to jointly learn a GNN-based inference network that partitions the edges into different communities, these community-specific GNNs, and a GNN-based predictor that combines community-specific GNNs for the end classification task.
arXiv Detail & Related papers (2022-02-07T14:37:50Z)
- Convolutional Neural Network Dynamics: A Graph Perspective [39.81881710355496]
We take a graph perspective and investigate the relationship between the graph structure of NNs and their performance.
For the dynamic graph representation of NNs, we explore structural representations for fully-connected and convolutional layers.
Our analysis shows that a simple summary of graph statistics can be used to accurately predict the performance of NNs.
arXiv Detail & Related papers (2021-11-09T20:38:48Z)
- DPGNN: Dual-Perception Graph Neural Network for Representation Learning [21.432960458513826]
Graph neural networks (GNNs) have drawn increasing attention in recent years and achieved remarkable performance in many graph-based tasks.
Most existing GNNs are based on the message-passing paradigm to iteratively aggregate neighborhood information in a single topology space.
We present a novel message-passing paradigm based on the properties of multi-step message source, node-specific message output, and multi-space message interaction.
arXiv Detail & Related papers (2021-10-15T05:47:26Z)
- Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks [68.9026534589483]
RioGNN is a novel reinforced, recursive, and flexible neighborhood-selection-guided multi-relational Graph Neural Network architecture.
RioGNN can learn more discriminative node embeddings with enhanced explainability due to the recognition of the individual importance of each relation.
arXiv Detail & Related papers (2021-04-16T04:30:06Z)
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386]
We propose a novel Hierarchical Message-passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that re-organises all nodes in a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed Hierarchical Community-aware Graph Neural Network (HC-GNN).
arXiv Detail & Related papers (2020-09-08T13:11:07Z)
- EdgeNets: Edge Varying Graph Neural Networks [179.99395949679547]
This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear and local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs).
arXiv Detail & Related papers (2020-01-21T15:51:17Z)