Unrolling of Deep Graph Total Variation for Image Denoising
- URL: http://arxiv.org/abs/2010.11290v2
- Date: Wed, 24 Mar 2021 02:04:56 GMT
- Title: Unrolling of Deep Graph Total Variation for Image Denoising
- Authors: Huy Vu, Gene Cheung and Yonina C. Eldar
- Abstract summary: In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
- Score: 106.93258903150702
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While deep learning (DL) architectures like convolutional neural networks
(CNNs) have enabled effective solutions in image denoising, in general their
implementations overly rely on training data, lack interpretability, and
require tuning of a large parameter set. In this paper, we combine classical
graph signal filtering with deep feature learning into a competitive hybrid
design -- one that utilizes interpretable analytical low-pass graph filters and
employs 80% fewer network parameters than the state-of-the-art DL denoising scheme
DnCNN. Specifically, to construct a suitable similarity graph for graph
spectral filtering, we first adopt a CNN to learn feature representations per
pixel, and then compute feature distances to establish edge weights. Given a
constructed graph, we next formulate a convex optimization problem for
denoising using a graph total variation (GTV) prior. Via an $\ell_1$ graph
Laplacian reformulation, we interpret its solution in an iterative procedure as
a graph low-pass filter and derive its frequency response. For fast filter
implementation, we realize this response using a Lanczos approximation.
Experimental results show that, in the case of statistical mismatch, our
algorithm outperformed DnCNN by up to 3 dB in PSNR.
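The pipeline in the abstract (per-pixel features, feature-distance edge weights, then GTV-regularized denoising interpreted as an iterative graph low-pass filter) can be illustrated end to end. Below is a minimal NumPy/SciPy sketch, not the paper's implementation: the function names (`build_pixel_graph`, `gtv_denoise`) and parameter values (`sigma`, `mu`, `eps`) are illustrative assumptions, and raw intensities stand in for the CNN-learned features.

```python
import numpy as np
from scipy.sparse import coo_matrix, diags, identity
from scipy.sparse.linalg import spsolve

def build_pixel_graph(feats, sigma=0.2):
    """4-connected pixel graph with edge weights from feature distances.

    feats: (H, W, F) per-pixel feature map. The paper learns these with
    a CNN; here any feature map (e.g. raw intensities) stands in.
    """
    H, W, _ = feats.shape
    idx = np.arange(H * W).reshape(H, W)
    rows, cols, dists = [], [], []
    for di, dj in ((0, 1), (1, 0)):  # right and down neighbours
        rows.append(idx[:H - di, :W - dj].ravel())
        cols.append(idx[di:, dj:].ravel())
        d = feats[:H - di, :W - dj] - feats[di:, dj:]
        dists.append(np.sum(d ** 2, axis=-1).ravel())
    rows, cols = np.concatenate(rows), np.concatenate(cols)
    w = np.exp(-np.concatenate(dists) / (2 * sigma ** 2))  # Gaussian kernel
    return rows, cols, w

def gtv_denoise(y, rows, cols, w, mu=5.0, eps=1e-3, iters=10):
    """IRLS-style loop for min_x ||y - x||_2^2 + mu * sum_ij w_ij |x_i - x_j|.

    Each pass rebuilds an l1-reweighted Laplacian and solves
    (I + mu * L) x = y, i.e. applies a low-pass graph filter with
    frequency response 1 / (1 + mu * lambda).
    """
    n = y.size
    x = y.copy()
    for _ in range(iters):
        # Reweight each edge by 1 / |x_i - x_j| (clamped at eps), turning
        # the l1 GTV term into a quadratic Laplacian form.
        gamma = w / np.maximum(np.abs(x[rows] - x[cols]), eps)
        A = coo_matrix((gamma, (rows, cols)), shape=(n, n))
        A = A + A.T                                       # symmetric adjacency
        L = diags(np.asarray(A.sum(axis=1)).ravel()) - A  # combinatorial Laplacian
        x = spsolve((identity(n) + mu * L).tocsc(), y)
    return x

# Toy usage: denoise a noisy gradient image, using its own intensities as features.
rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 16), (16, 1))
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
rows, cols, w = build_pixel_graph(noisy[..., None])
denoised = gtv_denoise(noisy.ravel(), rows, cols, w).reshape(16, 16)
```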
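For the fast-implementation step, the abstract mentions realizing the filter's frequency response with a Lanczos approximation. The sketch below shows the generic technique of approximating g(L)y with k Lanczos steps; the function name `lanczos_filter`, the step count `k`, and the example response are assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def lanczos_filter(L, y, g, k=10):
    """Approximate g(L) @ y with k Lanczos steps: g(L) y ~ ||y|| V g(T) e1.

    L: (n, n) symmetric graph Laplacian (dense here for clarity)
    y: (n,) graph signal
    g: filter frequency response applied to eigenvalues, e.g. the
       low-pass response lambda -> 1 / (1 + mu * lambda)
    """
    n = y.shape[0]
    V = np.zeros((n, k))
    alpha, beta = np.zeros(k), np.zeros(k)
    V[:, 0] = y / np.linalg.norm(y)
    for j in range(k):
        w = L @ V[:, j]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        if j + 1 == k:
            break
        beta[j] = np.linalg.norm(w)
        if beta[j] < 1e-12:  # invariant subspace reached; truncate early
            V, alpha, beta, k = V[:, :j + 1], alpha[:j + 1], beta[:j], j + 1
            break
        V[:, j + 1] = w / beta[j]
    # T = V^T L V is tridiagonal, so its eigendecomposition is cheap;
    # apply the response g to its eigenvalues.
    T = np.diag(alpha) + np.diag(beta[:k - 1], 1) + np.diag(beta[:k - 1], -1)
    evals, Q = np.linalg.eigh(T)
    e1 = np.zeros(k); e1[0] = 1.0
    return np.linalg.norm(y) * (V @ (Q @ (g(evals) * (Q.T @ e1))))

# e.g. low-pass filtering with the response from the GTV solver above:
# x_hat = lanczos_filter(L_dense, y, lambda lam: 1.0 / (1.0 + 5.0 * lam), k=15)
```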
Related papers
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
- Robust Graph Neural Network based on Graph Denoising [10.564653734218755]
Graph Neural Networks (GNNs) have emerged as a prominent alternative for learning problems dealing with non-Euclidean datasets.
This work proposes a robust implementation of GNNs that explicitly accounts for the presence of perturbations in the observed topology.
arXiv Detail & Related papers (2023-12-11T17:43:57Z)
- Learning Stable Graph Neural Networks via Spectral Regularization [18.32587282139282]
Stability of graph neural networks (GNNs) characterizes how GNNs react to graph perturbations and provides guarantees for architecture performance in noisy scenarios.
This paper develops a self-regularized graph neural network (SR-GNN) that improves the architecture stability by regularizing the filter frequency responses in the graph spectral domain.
arXiv Detail & Related papers (2022-11-13T17:27:21Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Adaptive Kernel Graph Neural Network [21.863238974404474]
Graph neural networks (GNNs) have demonstrated great success in representation learning for graph-structured data.
In this paper, we propose a novel framework, the Adaptive Kernel Graph Neural Network (AKGNN).
AKGNN is the first attempt to learn to adapt to the optimal graph kernel in a unified manner.
Experiments on widely used benchmark datasets demonstrate the strong performance of the proposed AKGNN.
arXiv Detail & Related papers (2021-12-08T20:23:58Z)
- Scaling Up Graph Neural Networks Via Graph Coarsening [18.176326897605225]
Scalability of graph neural networks (GNNs) is one of the major challenges in machine learning.
In this paper, we propose to use graph coarsening for scalable training of GNNs.
We show that simply applying off-the-shelf coarsening methods can reduce the number of nodes by up to a factor of ten without a noticeable drop in classification accuracy.
arXiv Detail & Related papers (2021-06-09T15:46:17Z)
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs Bernoulli-sampled from the graphon.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop AdaGNN, a graph neural network framework with a smooth, adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- Graphon Pooling in Graph Neural Networks [169.09536309161314]
Graph neural networks (GNNs) have been used effectively in different applications involving the processing of signals on irregular structures modeled by graphs.
We propose a new strategy for pooling and sampling on GNNs using graphons which preserves the spectral properties of the graph.
arXiv Detail & Related papers (2020-03-03T21:04:20Z)