ASGNN: Graph Neural Networks with Adaptive Structure
- URL: http://arxiv.org/abs/2210.01002v1
- Date: Mon, 3 Oct 2022 15:10:40 GMT
- Title: ASGNN: Graph Neural Networks with Adaptive Structure
- Authors: Zepeng Zhang, Songtao Lu, Zengfeng Huang, Ziping Zhao
- Abstract summary: We propose a novel interpretable message passing scheme with adaptive structure (ASMP) to defend against adversarial attacks on graph structure.
ASMP is adaptive in the sense that message passing in different layers can be carried out over dynamically adjusted graphs.
- Score: 41.83813812538167
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural network (GNN) models have achieved impressive results
in numerous machine learning tasks. However, many existing GNN models have been
shown to be vulnerable to adversarial attacks, creating a pressing need for
robust GNN architectures. In this work, we propose a novel interpretable
message passing scheme with adaptive structure (ASMP) to defend against
adversarial attacks on the graph structure. Layers in ASMP are derived from
optimization steps that minimize an objective function over the node features
and the graph structure jointly. ASMP is adaptive in the sense that message
passing in different layers can be carried out over dynamically adjusted
graphs. This property allows more fine-grained handling of a noisy (or
perturbed) graph structure and hence improves robustness. Convergence
properties of the ASMP scheme are theoretically
established. Integrating ASMP with neural networks can lead to a new family of
GNN models with adaptive structure (ASGNN). Extensive experiments on
semi-supervised node classification tasks demonstrate that the proposed ASGNN
outperforms the state-of-the-art GNN architectures in terms of classification
performance under various adversarial attacks.
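The abstract's core idea, layers derived from optimization steps that jointly update node features and the graph structure, can be illustrated with a minimal sketch. The objective, the alternating gradient updates, and all parameter names below are illustrative assumptions, not the authors' exact ASMP formulation:

```python
import numpy as np

def asmp_layer(X, F, S, A, lam=1.0, gamma=1.0, eta=0.1):
    """One hypothetical ASMP-style layer (a sketch under assumed updates).

    X: input node features, F: current node features,
    S: current (learned) adjacency, A: observed (possibly perturbed) adjacency.
    Assumed objective: ||F - X||^2 + lam * tr(F^T L_S F) + gamma * ||S - A||^2.
    """
    # Feature update: gradient step using the Laplacian of the *current* structure,
    # i.e., message passing over a dynamically adjusted graph.
    D = np.diag(S.sum(axis=1))
    L = D - S                                  # graph Laplacian of S
    F = F - eta * (2.0 * (F - X) + 2.0 * lam * L @ F)
    # Structure update: pull S toward the observed A while down-weighting edges
    # between dissimilar nodes (tr(F^T L F) = 0.5 * sum_ij S_ij ||f_i - f_j||^2).
    sq_dist = ((F[:, None, :] - F[None, :, :]) ** 2).sum(-1)
    S = S - eta * (0.5 * lam * sq_dist + 2.0 * gamma * (S - A))
    S = np.clip((S + S.T) / 2.0, 0.0, 1.0)     # keep S symmetric with entries in [0, 1]
    return F, S
```

Stacking several such layers, each refining both F and S, would give an ASGNN-style network; again, this is only a sketch of the alternating-optimization idea under assumed update rules.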
Related papers
- AdaRC: Mitigating Graph Structure Shifts during Test-Time [66.40525136929398]
Test-time adaptation (TTA) has attracted attention due to its ability to adapt a pre-trained model to a target domain without re-accessing the source domain.
We propose AdaRC, an innovative framework designed for effective and efficient adaptation to structure shifts in graphs.
arXiv Detail & Related papers (2024-10-09T15:15:40Z)
- SFR-GNN: Simple and Fast Robust GNNs against Structural Attacks [13.30477801940754]
Graph Neural Networks (GNNs) have demonstrated commendable performance for graph-structured data.
GNNs are often vulnerable to adversarial structural attacks as embedding generation relies on graph topology.
We propose an efficient defense method, called Simple and Fast Robust Graph Neural Network (SFR-GNN), supported by mutual information theory.
arXiv Detail & Related papers (2024-08-29T13:52:28Z)
- Graph Neural Networks Gone Hogwild [14.665528337423249]
Message passing graph neural networks (GNNs) generate catastrophically incorrect predictions when nodes update asynchronously during inference.
In this work we identify "implicitly-defined" GNNs as a class of architectures which is provably robust to partially asynchronous "hogwild" inference.
We then propose a novel implicitly-defined GNN architecture, which we call an energy GNN.
arXiv Detail & Related papers (2024-06-29T17:11:09Z)
- Learning Invariant Representations of Graph Neural Networks via Cluster Generalization [58.68231635082891]
Graph neural networks (GNNs) have become increasingly popular in modeling graph-structured data.
In this paper, we experimentally find that the performance of GNNs drops significantly when a structure shift occurs.
We propose the Cluster Information Transfer (CIT) mechanism, which can learn invariant representations for GNNs.
arXiv Detail & Related papers (2024-03-06T10:36:56Z)
- Attentional Graph Neural Networks for Robust Massive Network Localization [20.416879207269446]
Graph neural networks (GNNs) have emerged as a prominent tool for classification tasks in machine learning.
This paper integrates GNNs with attention mechanism to tackle a challenging nonlinear regression problem: network localization.
We first introduce a novel network localization method based on graph convolutional network (GCN), which exhibits exceptional precision even under severe non-line-of-sight (NLOS) conditions.
arXiv Detail & Related papers (2023-11-28T15:05:13Z)
- MuseGNN: Interpretable and Convergent Graph Neural Network Layers at Scale [15.93424606182961]
We propose a sampling-based energy function and scalable GNN layers that iteratively reduce it, guided by convergence guarantees in certain settings.
We also instantiate a full GNN architecture based on these designs, and the model achieves competitive accuracy and scalability when applied to the largest publicly-available node classification benchmark exceeding 1TB in size.
arXiv Detail & Related papers (2023-10-19T04:30:14Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network, named ACE-HGNN, to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386]
We propose a novel Hierarchical Message-passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that re-organises all nodes of a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed Hierarchical Community-aware Graph Neural Network (HC-GNN).
arXiv Detail & Related papers (2020-09-08T13:11:07Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.