Are All Edges Necessary? A Unified Framework for Graph Purification
- URL: http://arxiv.org/abs/2211.05184v1
- Date: Wed, 9 Nov 2022 20:28:25 GMT
- Title: Are All Edges Necessary? A Unified Framework for Graph Purification
- Authors: Zishan Gu, Jintang Li and Liang Chen
- Abstract summary: Not all edges in a graph are necessary for the training of machine learning models.
In this paper, we try to provide a method to drop edges in order to purify the graph data from a new perspective.
- Score: 6.795209119198288
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNNs), deep learning models that operate on
graph-structured data, have achieved strong performance in many works. However,
it has been shown repeatedly that not all edges in a graph are necessary for
training machine learning models. In other words, some of the connections
between nodes may carry redundant or even misleading information for downstream
tasks. In this paper, we propose a method to drop edges in order to purify the
graph data from a new perspective. Specifically, it is a framework for
purifying graphs with the least loss of information, under which the core
problems are how to better evaluate the edges and how to delete the relatively
redundant ones with minimal information loss. To address these two problems, we
propose several measurements for the evaluation and different judges and
filters for the edge deletion. We also introduce a residual-iteration strategy
and a surrogate model for measurements that require unknown information. The
experimental results show that our proposed KL-divergence measurement, combined
with constraints that maintain graph connectivity and an iterative deletion
procedure, removes the most edges while preserving GNN performance. Moreover,
further experiments show that this method also achieves the best defense
performance against adversarial attacks.
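The abstract names the ingredients (a KL-divergence edge measurement, a connectivity constraint, and iterative deletion) without specifying the exact judges and filters. A minimal sketch of that stated idea, under our own assumptions: each node carries some feature or prediction distribution, edges whose endpoint distributions are nearly identical are treated as redundant, and the lowest-scoring edge is removed at each step only if the graph stays connected. All function names and the `node_dists` input are assumptions for illustration, not the paper's API.

```python
import math
from collections import defaultdict

def kl_divergence(p, q, eps=1e-12):
    """KL divergence between two discrete distributions over the same support."""
    zp, zq = sum(p), sum(q)
    return sum((pi / zp) * math.log((pi / zp + eps) / (qi / zq + eps))
               for pi, qi in zip(p, q))

def edge_score(p, q):
    """Symmetrised KL divergence; a low score marks an assumed-redundant edge."""
    return kl_divergence(p, q) + kl_divergence(q, p)

def is_connected(adj):
    """BFS connectivity check on an adjacency dict {node: set(neighbours)}."""
    nodes = list(adj)
    seen, stack = {nodes[0]}, [nodes[0]]
    while stack:
        for nb in adj[stack.pop()]:
            if nb not in seen:
                seen.add(nb)
                stack.append(nb)
    return len(seen) == len(nodes)

def purify_graph(edges, node_dists, drop_fraction=0.2):
    """Iteratively drop the lowest-scoring edges while keeping the graph connected.

    edges      -- list of undirected (u, v) pairs
    node_dists -- dict mapping node -> distribution (e.g. softened label scores)
    """
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    kept = set(map(frozenset, edges))
    n_drop = int(drop_fraction * len(kept))
    for _ in range(n_drop):
        # Re-rank the surviving edges each round (the iterative deletion step).
        ranked = sorted(kept, key=lambda e: edge_score(*(node_dists[n] for n in e)))
        for e in ranked:
            u, v = tuple(e)
            adj[u].discard(v)
            adj[v].discard(u)
            if is_connected(adj):      # connectivity constraint
                kept.discard(e)
                break
            adj[u].add(v)              # deletion would disconnect; restore
            adj[v].add(u)
        else:
            break                      # every remaining edge is a bridge
    return [tuple(e) for e in kept]
```

On a 4-cycle with a chord, where two adjacent nodes share an identical distribution, the edge between those twin nodes is scored as redundant and removed first, while bridges survive.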
Related papers
- Why Does Dropping Edges Usually Outperform Adding Edges in Graph Contrastive Learning? [54.44813218411879]
We introduce a new metric, namely Error Passing Rate (EPR), to quantify how a graph fits the network.
Inspired by the theoretical conclusions and the idea of positive-incentive noise, we propose a novel GCL algorithm, Error-PAssing-based Graph Contrastive Learning (EPAGCL)
We generate views by adding and dropping edges based on the weights derived from EPR.
arXiv Detail & Related papers (2024-12-11T06:31:06Z) - ADEdgeDrop: Adversarial Edge Dropping for Robust Graph Neural Networks [53.41164429486268]
Graph Neural Networks (GNNs) have exhibited the powerful ability to gather graph-structured information from neighborhood nodes.
The performance of GNNs is limited by poor generalization and fragile robustness caused by noisy and redundant graph data.
We propose a novel adversarial edge-dropping method (ADEdgeDrop) that leverages an adversarial edge predictor guiding the removal of edges.
arXiv Detail & Related papers (2024-03-14T08:31:39Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
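NodeFormer's kernelized operator is not reproduced here, but the Gumbel-Softmax trick it builds on can be sketched in a few lines: adding Gumbel noise to logits and taking a temperature-scaled softmax approximates sampling a discrete neighbour while keeping the operation differentiable. This is a generic illustration under our own assumptions, not NodeFormer's implementation.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Draw a soft (differentiable) sample from a categorical distribution.

    logits -- unnormalised scores, e.g. affinities of one node to candidates
    tau    -- temperature; smaller values push the sample towards one-hot
    """
    rng = rng or np.random.default_rng()
    # Gumbel(0, 1) noise via inverse transform sampling.
    gumbel = -np.log(-np.log(rng.uniform(size=np.shape(logits))))
    y = (np.asarray(logits, dtype=float) + gumbel) / tau
    y = y - y.max()            # numerical stability before exponentiating
    e = np.exp(y)
    return e / e.sum()
```

Because the output is a proper probability vector rather than a hard argmax, gradients can flow through the neighbour-selection step during training.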
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Learnable Graph Matching: A Practical Paradigm for Data Association [74.28753343714858]
We propose a general learnable graph matching method to address these issues.
Our method achieves state-of-the-art performance on several MOT datasets.
For image matching, our method outperforms state-of-the-art methods on a popular indoor dataset, ScanNet.
arXiv Detail & Related papers (2023-03-27T17:39:00Z) - Graph Classification via Discriminative Edge Feature Learning [17.86550507456848]
Spectral graph convolutional neural networks (GCNNs) have been producing encouraging results in graph classification tasks.
We design an edge feature scheme and an add-on layer between every two stacked graph convolution layers in GCNN.
Our method outperforms state-of-the-art graph classification methods on three new graph datasets.
arXiv Detail & Related papers (2022-10-05T07:30:21Z) - Certified Graph Unlearning [39.29148804411811]
Graph-structured data is ubiquitous in practice and often processed using graph neural networks (GNNs).
We introduce the first known framework for certified graph unlearning of GNNs.
Three different types of unlearning requests need to be considered, including node feature, edge and node unlearning.
arXiv Detail & Related papers (2022-06-18T07:41:10Z) - Training Robust Graph Neural Networks with Topology Adaptive Edge Dropping [116.26579152942162]
Graph neural networks (GNNs) are processing architectures that exploit graph structural information to model representations from network data.
Despite their success, GNNs suffer from sub-optimal generalization performance given limited training data.
This paper proposes Topology Adaptive Edge Dropping to improve generalization performance and learn robust GNN models.
arXiv Detail & Related papers (2021-06-05T13:20:36Z) - GraphMI: Extracting Private Graph Data from Graph Neural Networks [59.05178231559796]
We present the Graph Model Inversion attack (GraphMI), which aims to extract private graph data of the training graph by inverting the GNN.
Specifically, we propose a projected gradient module to tackle the discreteness of graph edges while preserving the sparsity and smoothness of graph features.
We design a graph auto-encoder module to efficiently exploit graph topology, node attributes, and target model parameters for edge inference.
arXiv Detail & Related papers (2021-06-05T07:07:52Z) - Co-embedding of Nodes and Edges with Graph Neural Networks [13.020745622327894]
Graph embedding is a way to transform and encode graph-structured data in a high-dimensional, non-Euclidean feature space.
CensNet is a general graph embedding framework, which embeds both nodes and edges to a latent feature space.
Our approach achieves or matches the state-of-the-art performance in four graph learning tasks.
arXiv Detail & Related papers (2020-10-25T22:39:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.