Addressing Noise and Efficiency Issues in Graph-Based Machine Learning
Models From the Perspective of Adversarial Attack
- URL: http://arxiv.org/abs/2401.15615v1
- Date: Sun, 28 Jan 2024 10:03:37 GMT
- Title: Addressing Noise and Efficiency Issues in Graph-Based Machine Learning
Models From the Perspective of Adversarial Attack
- Authors: Yongyu Wang
- Abstract summary: We propose treating noisy edges as adversarial attacks and using a spectral adversarial robustness evaluation method to diminish their impact on the performance of graph algorithms.
Our method identifies the points that are less vulnerable to noisy edges and uses only these robust points when running graph-based algorithms.
- Score: 2.1937382384136637
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Given that no existing graph construction method can generate a perfect graph
for a given dataset, graph-based algorithms are invariably affected by the
plethora of redundant and erroneous edges present within the constructed
graphs. In this paper, we propose treating these noisy edges as adversarial
attacks and using a spectral adversarial robustness evaluation method to diminish
the impact of noisy edges on the performance of graph algorithms. Our method
identifies the points that are less vulnerable to noisy edges and uses only
these robust points when running graph-based algorithms. Our experiments
with spectral clustering, one of the most representative and widely utilized
graph algorithms, reveal that our methodology not only substantially improves
the precision of the algorithm but also greatly reduces its computational cost
by operating on only a select number of robust data points.
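The abstract gives no implementation details, but the pipeline it describes can be sketched roughly as: build a similarity graph, score how stable each point's spectral embedding is under edge perturbations, keep only the most robust points, and run spectral clustering on that subset. The sketch below is a minimal illustration under those assumptions; in particular, the perturbation-based robustness score is a stand-in, not the paper's spectral adversarial robustness evaluation method, and all function and parameter names are hypothetical.

```python
# A minimal sketch of the pipeline described in the abstract: build a k-NN
# graph, score each point's robustness to edge noise, and run spectral
# clustering on the robust points only. The robustness score below (stability
# of a node's spectral embedding under random edge drops) is an illustrative
# proxy, NOT the paper's spectral adversarial robustness evaluation method;
# all function and parameter names are assumptions.
import numpy as np
from sklearn.neighbors import kneighbors_graph
from sklearn.manifold import spectral_embedding
from sklearn.cluster import SpectralClustering


def robustness_scores(X, k=10, dim=8, trials=5, drop=0.2, seed=0):
    """Higher score = the node's spectral embedding moves less when edges are dropped."""
    rng = np.random.default_rng(seed)
    adj = kneighbors_graph(X, k, mode="connectivity", include_self=False)
    adj = 0.5 * (adj + adj.T)                      # symmetrize the k-NN graph
    base = spectral_embedding(adj, n_components=dim, random_state=0)
    drift = np.zeros(X.shape[0])
    for _ in range(trials):
        pert = adj.copy()
        pert.data = pert.data * (rng.random(pert.nnz) > drop)  # drop a fraction of edges
        pert.eliminate_zeros()
        emb = spectral_embedding(0.5 * (pert + pert.T),
                                 n_components=dim, random_state=0)
        signs = np.sign(np.sum(emb * base, axis=0))             # resolve eigenvector
        signs[signs == 0] = 1.0                                 # sign ambiguity
        drift += np.linalg.norm(emb * signs - base, axis=1)
    return -drift / trials


def cluster_robust_points(X, n_clusters, keep_ratio=0.3):
    """Spectral clustering restricted to the most robust points; the remaining
    points could then be assigned, e.g., to their nearest robust neighbor's cluster."""
    scores = robustness_scores(X)
    n_keep = max(n_clusters, int(keep_ratio * len(X)))
    robust_idx = np.argsort(scores)[-n_keep:]
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="nearest_neighbors").fit_predict(X[robust_idx])
    return robust_idx, labels
```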
Related papers
- The Graph Lottery Ticket Hypothesis: Finding Sparse, Informative Graph
Structure [18.00833762891405]
Graph Lottery Ticket (GLT) Hypothesis: There is an extremely sparse backbone for every graph.
We study 8 key metrics of interest that directly influence the performance of graph learning algorithms.
We propose a straightforward and efficient algorithm for finding these GLTs in arbitrary graphs.
arXiv Detail & Related papers (2023-12-08T00:24:44Z)
- Uncertainty-Aware Robust Learning on Noisy Graphs [16.66112191539017]
This paper proposes a novel uncertainty-aware graph learning framework motivated by distributionally robust optimization.
Specifically, we use a graph neural network-based encoder to embed the node features and find the optimal node embeddings.
Such an uncertainty-aware learning process leads to improved node representations and a more robust graph predictive model.
arXiv Detail & Related papers (2023-06-14T02:45:14Z)
- Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
arXiv Detail & Related papers (2023-05-31T11:04:53Z)
- Improved Exact and Heuristic Algorithms for Maximum Weight Clique [1.2074552857379273]
We propose improved exact and heuristic algorithms for solving the maximum weight clique problem.
Our algorithms interleave successful techniques with novel data reduction rules that use local graph structure.
We evaluate our algorithms on a range of synthetic and real-world graphs, and find that they outperform the current state of the art on most inputs.
arXiv Detail & Related papers (2023-02-01T14:02:06Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Score matching enables causal discovery of nonlinear additive noise models [63.93669924730725]
We show how to design a new generation of scalable causal discovery methods.
We propose a new efficient method for approximating the score's Jacobian, enabling us to recover the causal graph.
arXiv Detail & Related papers (2022-03-08T21:34:46Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- Graph Information Bottleneck for Subgraph Recognition [103.37499715761784]
We propose a framework of Graph Information Bottleneck (GIB) for the subgraph recognition problem in deep graph learning.
Under this framework, one can recognize the maximally informative yet compressive subgraph, named IB-subgraph.
We evaluate the properties of the IB-subgraph in three application scenarios: improvement of graph classification, graph interpretation and graph denoising.
arXiv Detail & Related papers (2020-10-12T09:32:20Z)
- Faster Graph Embeddings via Coarsening [25.37181684580123]
Graph embeddings are a ubiquitous tool for machine learning tasks, such as node classification and link prediction, on graph-structured data.
However, computing the embeddings for large-scale graphs is prohibitively expensive even if we are interested only in a small subset of relevant vertices.
We present an efficient graph coarsening approach, based on Schur complements, for computing the embedding of the relevant vertices (see the Schur-complement sketch after this list).
arXiv Detail & Related papers (2020-07-06T15:22:25Z)
- Online Dense Subgraph Discovery via Blurred-Graph Feedback [87.9850024070244]
We introduce a novel learning problem for dense subgraph discovery.
We first propose an edge-time algorithm that obtains a nearly-optimal solution with high probability.
We then design a more scalable algorithm with a theoretical guarantee.
arXiv Detail & Related papers (2020-06-24T11:37:33Z)
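As a concrete illustration of one item above, the coarsening approach in "Faster Graph Embeddings via Coarsening" rests on the Schur complement of the graph Laplacian: eliminating the non-relevant vertices V from a Laplacian L leaves L_SS - L_SV L_VV^{-1} L_VS, which is itself the Laplacian of a (denser, weighted) graph on the relevant vertex set S. The dense-matrix sketch below only illustrates that identity; the paper itself relies on scalable approximations rather than an exact dense solve, and the function name is hypothetical.

```python
# A minimal dense-matrix illustration of Schur-complement coarsening: eliminate
# the non-relevant vertices from the graph Laplacian and keep a smaller
# Laplacian on the relevant vertices. Real implementations (as in the paper
# above) use sparse / randomized approximations instead of a dense solve.
import numpy as np

def schur_complement_laplacian(L, relevant):
    """L: (n, n) dense graph Laplacian; relevant: indices of vertices to keep."""
    n = L.shape[0]
    keep = np.asarray(relevant)
    drop = np.setdiff1d(np.arange(n), keep)
    L_SS = L[np.ix_(keep, keep)]
    L_SV = L[np.ix_(keep, drop)]
    L_VV = L[np.ix_(drop, drop)]
    L_VS = L[np.ix_(drop, keep)]
    # Schur complement: itself a Laplacian of a (denser) graph on the kept vertices
    return L_SS - L_SV @ np.linalg.solve(L_VV, L_VS)

# Toy usage: path graph 0-1-2-3, keep vertices {0, 3}. The three unit edges in
# series collapse to a single edge of weight 1/3 between the kept vertices.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
print(schur_complement_laplacian(L, [0, 3]))  # [[ 1/3, -1/3], [-1/3, 1/3]]
```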