Compressive Recovery of Signals Defined on Perturbed Graphs
- URL: http://arxiv.org/abs/2402.07637v2
- Date: Fri, 16 Feb 2024 05:01:02 GMT
- Title: Compressive Recovery of Signals Defined on Perturbed Graphs
- Authors: Sabyasachi Ghosh and Ajit Rajwade
- Abstract summary: We present an algorithm which simultaneously recovers the signal from the compressive measurements and also corrects the graph perturbations.
An application to compressive image reconstruction is also presented, where graph perturbations are modeled as undesirable graph edges linking pixels with significant intensity difference.
- Score: 4.021249101488848
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recovery of signals with elements defined on the nodes of a graph
from compressive measurements is an important problem, which arises in various
domains such as sensor networks, image reconstruction and group testing. In
some scenarios, the graph may not be accurately known, and there may exist a
few edge additions or deletions relative to a ground truth graph. Such
perturbations, even if small in number, significantly affect the Graph Fourier
Transform (GFT). This impedes recovery of signals which may have sparse
representations in the GFT bases of the ground truth graph. We present an
algorithm which simultaneously recovers the signal from the compressive
measurements and also corrects the graph perturbations. We analyze some
important theoretical properties of the algorithm. Our approach to correction
for graph perturbations is based on model selection techniques such as
cross-validation in compressed sensing. We validate our algorithm on signals
which have a sparse representation in the GFT bases of many commonly used
graphs in the network science literature. An application to compressive image
reconstruction is also presented, where graph perturbations are modeled as
undesirable graph edges linking pixels with significant intensity difference.
In all experiments, our algorithm clearly outperforms baseline techniques which
either ignore the perturbations or use first-order approximations to the
perturbations in the GFT bases.
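The abstract's two key ingredients — sparse recovery in the GFT basis of a graph, and model selection via cross-validation on held-out compressive measurements — can be illustrated with a small sketch. This is not the authors' algorithm (which jointly corrects perturbations during recovery); it is a minimal stand-in, assuming a path graph, a single spurious candidate edge, and a plain OMP solver. All names and parameter choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32  # number of graph nodes

def path_laplacian(extra_edges=()):
    """Laplacian L = D - A of a path graph on n nodes, plus optional extra edges."""
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1.0
    for (u, v) in extra_edges:
        A[u, v] = A[v, u] = 1.0
    return np.diag(A.sum(axis=1)) - A

def gft_basis(L):
    """GFT basis: eigenvectors of the graph Laplacian (as columns)."""
    _, U = np.linalg.eigh(L)
    return U

def omp(D, y, k):
    """Plain Orthogonal Matching Pursuit: k-sparse fit of y over columns of D."""
    support, r = [], y.copy()
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ r)))  # atom most correlated with residual
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef
    s = np.zeros(D.shape[1])
    s[support] = coef
    return s

# Ground-truth graph vs. a perturbed candidate with one spurious edge.
U_true = gft_basis(path_laplacian())
U_pert = gft_basis(path_laplacian(extra_edges=[(5, 20)]))

# Signal that is k-sparse in the GFT basis of the TRUE graph.
k = 3
s = np.zeros(n)
s[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
x = U_true @ s

# Compressive measurements, split into a fit set and a held-out validation set.
m, m_fit = 26, 20
Phi = rng.normal(size=(m, n)) / np.sqrt(m)
y = Phi @ x
Phi_fit, y_fit = Phi[:m_fit], y[:m_fit]
Phi_val, y_val = Phi[m_fit:], y[m_fit:]

# Cross-validation-style model selection: recover with each candidate GFT
# basis on the fit measurements, then score by residual on the held-out set.
errs = {}
for name, U in [("true", U_true), ("perturbed", U_pert)]:
    s_hat = omp(Phi_fit @ U, y_fit, k)
    x_hat = U @ s_hat
    errs[name] = np.linalg.norm(Phi_val @ x_hat - y_val)

best = min(errs, key=errs.get)
print("validation errors:", errs, "-> selected:", best)
```

Because the signal is exactly sparse only in the true graph's GFT basis, recovery under the perturbed basis leaves a larger residual on the held-out measurements, so the validation score identifies the correct graph. The paper extends this idea to searching over and correcting individual edge perturbations jointly with recovery.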
Related papers
- Geometric instability of graph neural networks on large graphs [0.0]
We analyse the geometric instability of embeddings produced by graph neural networks (GNNs)
We propose a simple, efficient and graph-native Graph Gram Index (GGI) to measure such instability.
This allows us to study the varying instability behaviour of GNN embeddings on large graphs for both node classification and link prediction.
arXiv Detail & Related papers (2023-08-19T20:10:54Z)
- Graph Laplacian Learning with Exponential Family Noise [8.594140167290098]
We propose a versatile graph inference framework for learning from graph signals corrupted by exponential family noise.
Our framework generalizes previous methods from continuous smooth graph signals to various data types.
arXiv Detail & Related papers (2023-06-14T02:09:52Z)
- Gradient scarcity with Bilevel Optimization for Graph Learning [0.0]
Gradient scarcity occurs when learning graphs by minimizing a loss on a subset of nodes: edges between unlabelled nodes that are far from labelled ones receive zero gradients.
We give a precise mathematical characterization of this phenomenon, and prove it also emerges in bilevel optimization.
To alleviate this issue, we study several solutions: we propose to resort to latent graph learning using a Graph-to-Graph model (G2G), graph regularization to impose a prior structure on the graph, or optimizing on a larger graph than the original one with a reduced diameter.
arXiv Detail & Related papers (2023-03-24T12:37:43Z)
- Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, i.e., BGS-IMC which considers continuous random Gaussian noise in the graph Fourier domain.
arXiv Detail & Related papers (2023-02-08T08:17:43Z)
- Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs [131.53471236405628]
We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of [0, 1]^2 in the graphon space.
We prove that those low dimensional representations constitute a convergent sequence of graphs and graph signals.
We observe that graphon pooling performs significantly better than other approaches proposed in the literature when dimensionality reduction ratios between layers are large.
arXiv Detail & Related papers (2022-12-15T22:11:34Z)
- FoSR: First-order spectral rewiring for addressing oversquashing in GNNs [0.0]
Graph neural networks (GNNs) are able to leverage the structure of graph data by passing messages along the edges of the graph.
We propose a computationally efficient algorithm that prevents oversquashing by systematically adding edges to the graph.
We find experimentally that our algorithm outperforms existing graph rewiring methods in several graph classification tasks.
arXiv Detail & Related papers (2022-10-21T07:58:03Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
arXiv Detail & Related papers (2020-10-21T20:04:22Z)
- Graphon Pooling in Graph Neural Networks [169.09536309161314]
Graph neural networks (GNNs) have been used effectively in different applications involving the processing of signals on irregular structures modeled by graphs.
We propose a new strategy for pooling and sampling on GNNs using graphons which preserves the spectral properties of the graph.
arXiv Detail & Related papers (2020-03-03T21:04:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.