inGRASS: Incremental Graph Spectral Sparsification via
Low-Resistance-Diameter Decomposition
- URL: http://arxiv.org/abs/2402.16990v1
- Date: Mon, 26 Feb 2024 19:49:54 GMT
- Title: inGRASS: Incremental Graph Spectral Sparsification via
Low-Resistance-Diameter Decomposition
- Authors: Ali Aghdaei and Zhuo Feng
- Abstract summary: inGRASS is a novel algorithm designed for incremental spectral sparsification of large undirected graphs.
The proposed algorithm is highly scalable and parallel-friendly, having a nearly-linear time complexity for the setup phase.
In experiments, inGRASS achieves over $200\times$ speedups while retaining comparable solution quality.
- Score: 5.457150493905064
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This work presents inGRASS, a novel algorithm designed for incremental
spectral sparsification of large undirected graphs. The proposed inGRASS
algorithm is highly scalable and parallel-friendly, having a nearly-linear time
complexity for the setup phase and the ability to update the spectral
sparsifier in $O(\log N)$ time for each incremental change made to the original
graph with $N$ nodes. A key component in the setup phase of inGRASS is a
multilevel resistance embedding framework introduced for efficiently
identifying spectrally-critical edges and effectively detecting redundant ones,
which is achieved by decomposing the initial sparsifier into many node clusters
with bounded effective-resistance diameters leveraging a
low-resistance-diameter decomposition (LRD) scheme. The update phase of inGRASS
exploits low-dimensional node embedding vectors for efficiently estimating the
importance and uniqueness of each newly added edge. As demonstrated through
extensive experiments, inGRASS achieves over $200\times$ speedups while
retaining comparable solution quality in incremental spectral sparsification of
graphs obtained from various datasets, such as circuit simulations, finite
element analysis, and social networks.
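The abstract's notion of "spectrally-critical" versus "redundant" edges comes down to effective-resistance leverage scores: for an edge $(u, v)$ with weight $w$, the score $w \cdot R_{\mathrm{eff}}(u, v)$ is near 1 when the edge is essential to the graph's spectrum and near 0 when it is redundant. The following is a minimal, hypothetical sketch of that quantity on a toy graph (it is not the authors' inGRASS code, which estimates these scores via low-dimensional embeddings rather than a dense pseudoinverse):

```python
import numpy as np

def laplacian(n, edges):
    """Weighted graph Laplacian from a list of (u, v, w) edges."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    return L

def effective_resistance(Lpinv, u, v):
    """R_eff(u, v) = (e_u - e_v)^T L^+ (e_u - e_v)."""
    return Lpinv[u, u] + Lpinv[v, v] - 2.0 * Lpinv[u, v]

# Toy example: a 4-node cycle plus one chord (0, 2). The chord is
# shortcut by two parallel paths, so its leverage score is low;
# the cycle edges are harder to replace, so theirs are higher.
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 0, 1.0), (0, 2, 1.0)]
Lpinv = np.linalg.pinv(laplacian(4, edges))
scores = {(u, v): w * effective_resistance(Lpinv, u, v) for u, v, w in edges}
# Leverage scores of all edges sum to n - 1 for a connected graph.
```

This dense-pseudoinverse computation costs $O(N^3)$ and is only for illustration; the point of inGRASS is to approximate the same scores in nearly-linear setup time and $O(\log N)$ per incremental edge via the multilevel resistance embedding.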
Related papers
- Efficient Heterogeneous Graph Learning via Random Projection [65.65132884606072]
Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for deep learning on heterogeneous graphs.
Recent pre-computation-based HGNNs use one-time message passing to transform a heterogeneous graph into regular-shaped tensors.
We propose a hybrid pre-computation-based HGNN, named Random Projection Heterogeneous Graph Neural Network (RpHGNN)
arXiv Detail & Related papers (2023-10-23T01:25:44Z) - Accelerating Scalable Graph Neural Network Inference with Node-Adaptive
Propagation [80.227864832092]
Graph neural networks (GNNs) have exhibited exceptional efficacy in a diverse array of applications.
The sheer size of large-scale graphs presents a significant challenge to real-time inference with GNNs.
We propose an online propagation framework and two novel node-adaptive propagation methods.
arXiv Detail & Related papers (2023-10-17T05:03:00Z) - A Graph Encoder-Decoder Network for Unsupervised Anomaly Detection [7.070726553564701]
We propose an unsupervised graph encoder-decoder model to detect abnormal nodes from graphs.
In the encoding stage, we design a novel pooling mechanism, named LCPool, to find a cluster assignment matrix.
In the decoding stage, we propose an unpooling operation, called LCUnpool, to reconstruct both the structure and nodal features of the original graph.
arXiv Detail & Related papers (2023-08-15T13:49:12Z) - SF-SGL: Solver-Free Spectral Graph Learning from Linear Measurements [16.313489255657203]
A spectral graph densification framework (SGL) is presented for learning resistor networks with linear measurements.
A solver-free method (SF-SGL) exploits multilevel spectral approximation of the graphs.
The framework enables an EDA algorithm for vectorless power/thermal integrity verifications.
arXiv Detail & Related papers (2023-02-09T00:33:19Z) - Low-complexity Near-optimum Symbol Detection Based on Neural Enhancement
of Factor Graphs [2.030567625639093]
We consider the application of the factor graph framework for symbol detection on linear inter-symbol interference channels.
We develop and evaluate strategies to improve the performance of the factor graph-based symbol detection by means of neural enhancement.
arXiv Detail & Related papers (2022-03-30T15:58:53Z) - Unfolding Projection-free SDP Relaxation of Binary Graph Classifier via
GDPA Linearization [59.87663954467815]
Algorithm unfolding creates an interpretable and parsimonious neural network architecture by implementing each iteration of a model-based algorithm as a neural layer.
In this paper, leveraging a recent linear algebraic theorem called Gershgorin disc perfect alignment (GDPA), we unroll a projection-free algorithm for semi-definite programming relaxation (SDR) of a binary graph classifier.
Experimental results show that our unrolled network outperformed pure model-based graph classifiers, and achieved comparable performance to pure data-driven networks but using far fewer parameters.
arXiv Detail & Related papers (2021-09-10T07:01:15Z) - Gradient Boosted Binary Histogram Ensemble for Large-scale Regression [60.16351608335641]
We propose a gradient boosting algorithm for large-scale regression problems called Gradient Boosted Binary Histogram Ensemble (GBBHE), based on binary histogram partition and ensemble learning.
In the experiments, compared with other state-of-the-art algorithms such as gradient boosted regression tree (GBRT), our GBBHE algorithm shows promising performance with less running time on large-scale datasets.
arXiv Detail & Related papers (2021-06-03T17:05:40Z) - SGL: Spectral Graph Learning from Measurements [4.721069729610892]
This work introduces a highly scalable spectral graph densification framework for learning resistor networks with linear measurements.
We show that the proposed approach is highly scalable for learning ultra-sparse resistor networks without sacrificing solution quality.
arXiv Detail & Related papers (2021-04-16T03:01:15Z) - SF-GRASS: Solver-Free Graph Spectral Sparsification [16.313489255657203]
This work introduces a solver-free approach (SF-GRASS) for spectral graph sparsification by leveraging emerging spectral graph coarsening and graph signal processing techniques.
We show that the proposed method can produce a hierarchy of high-quality spectral sparsifiers in nearly-linear time for a variety of real-world, large-scale graphs and circuit networks.
arXiv Detail & Related papers (2020-08-17T21:37:19Z) - Fast Graph Attention Networks Using Effective Resistance Based Graph
Sparsification [70.50751397870972]
FastGAT is a method to make attention based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
arXiv Detail & Related papers (2020-06-15T22:07:54Z) - Efficient and Stable Graph Scattering Transforms via Pruning [86.76336979318681]
Graph scattering transforms (GSTs) offer training-free deep GCN models that extract features from graph data.
The price paid by GSTs is exponential complexity in space and time that increases with the number of layers.
The present work addresses the complexity limitation of GSTs by introducing an efficient so-termed pruned (p) GST approach.
arXiv Detail & Related papers (2020-01-27T16:05:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.