Revisiting convolutional neural network on graphs with polynomial approximations of Laplace-Beltrami spectral filtering
- URL: http://arxiv.org/abs/2010.13269v1
- Date: Mon, 26 Oct 2020 01:18:05 GMT
- Title: Revisiting convolutional neural network on graphs with polynomial approximations of Laplace-Beltrami spectral filtering
- Authors: Shih-Gu Huang, Moo K. Chung, Anqi Qiu, Alzheimer's Disease
Neuroimaging Initiative
- Abstract summary: This paper revisits the spectral graph convolutional neural networks (graph-CNNs) given in Defferrard (2016).
We develop the Laplace-Beltrami CNN (LB-CNN) by replacing the graph Laplacian with the LB operator.
- Score: 6.111909222842263
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper revisits spectral graph convolutional neural networks (graph-CNNs)
given in Defferrard (2016) and develops the Laplace-Beltrami CNN (LB-CNN) by
replacing the graph Laplacian with the LB operator. We then define spectral
filters via the LB operator on a graph. We explore the feasibility of
Chebyshev, Laguerre, and Hermite polynomials to approximate LB-based spectral
filters and define an update of the LB operator for pooling in the LB-CNN. We
employ the brain image data from Alzheimer's Disease Neuroimaging Initiative
(ADNI) and demonstrate the use of the proposed LB-CNN. Based on the cortical
thickness of the ADNI dataset, we show that the LB-CNN does not improve
classification accuracy compared to the spectral graph-CNN. The three
polynomials had a similar computational cost and showed comparable
classification accuracy in the LB-CNN or spectral graph-CNN. Our findings
suggest that even though the shapes of the three polynomials are different,
deep learning architecture allows us to learn spectral filters such that the
classification performance is not dependent on the type of the polynomials or
the operators (graph Laplacian and LB operator).
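As a concrete illustration of the polynomial-filtering idea, the sketch below evaluates a K-term Chebyshev expansion of a spectral filter using only sparse matrix-vector products, following the construction in Defferrard (2016) that the LB-CNN reuses with the LB operator in place of the graph Laplacian. This is a minimal sketch, not the authors' implementation; the helper names and the toy graph are illustrative.

```python
# A minimal sketch (not the authors' code) of Chebyshev polynomial spectral
# filtering on a graph, in the style of Defferrard (2016). The LB-CNN reuses
# the same recurrence with a discretized LB operator in place of L. All names
# and the toy graph below are illustrative.
import numpy as np
import scipy.sparse as sp

def normalized_laplacian(adj):
    """L = I - D^{-1/2} A D^{-1/2} for a symmetric adjacency matrix."""
    deg = np.asarray(adj.sum(axis=1)).ravel()
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    D = sp.diags(d_inv_sqrt)
    return sp.eye(adj.shape[0], format="csr") - D @ adj @ D

def chebyshev_filter(L, x, theta, lmax=2.0):
    """Evaluate y = sum_k theta_k T_k(L_tilde) x without eigendecomposition.

    L_tilde = (2 / lmax) L - I rescales the spectrum into [-1, 1], where the
    recurrence T_k(z) = 2 z T_{k-1}(z) - T_{k-2}(z) is numerically stable.
    Assumes len(theta) >= 2; lmax = 2 suffices for the normalized Laplacian,
    whose eigenvalues lie in [0, 2].
    """
    L_tilde = (2.0 / lmax) * L - sp.eye(L.shape[0], format="csr")
    t_prev, t_curr = x, L_tilde @ x                 # T_0 x and T_1 x
    y = theta[0] * t_prev + theta[1] * t_curr
    for k in range(2, len(theta)):                  # three-term recurrence
        t_prev, t_curr = t_curr, 2.0 * (L_tilde @ t_curr) - t_prev
        y = y + theta[k] * t_curr
    return y

# Toy usage: a 4-node path graph, impulse signal, K = 3 filter coefficients.
rows = np.array([0, 1, 1, 2, 2, 3])
cols = np.array([1, 0, 2, 1, 3, 2])
A = sp.csr_matrix((np.ones(6), (rows, cols)), shape=(4, 4))
L = normalized_laplacian(A)
x = np.array([1.0, 0.0, 0.0, 0.0])
theta = np.array([0.5, 0.3, 0.2])   # learned in practice, fixed here
print(chebyshev_filter(L, x, theta))
```

Each of the K terms costs one sparse multiplication by the rescaled operator, and Laguerre and Hermite expansions admit analogous three-term recurrences, which is consistent with the paper's finding that the three polynomial families carry similar computational cost.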
Related papers
- Large-Scale Spectral Graph Neural Networks via Laplacian Sparsification: Technical Report [21.288230563135055]
We propose a novel graph spectral sparsification method to approximate the propagation patterns of spectral Graph Neural Networks (GNNs).
Our method allows the application of linear layers on the input node features, enabling end-to-end training as well as the handling of raw features.
arXiv Detail & Related papers (2025-01-08T15:36:19Z)
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ SSMs for the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z)
- Polynomial Selection in Spectral Graph Neural Networks: An Error-Sum of Function Slices Approach [26.79625547648669]
Spectral graph neural networks are proposed to harness the spectral information inherent in graph data through the application of graph filters.
We show that various choices greatly impact spectral GNN performance, underscoring the importance of parameter selection.
We develop an advanced filter based on trigonometric polynomials, a widely adopted option for approximating narrow signal slices.
arXiv Detail & Related papers (2024-04-15T11:35:32Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- TetCNN: Convolutional Neural Networks on Tetrahedral Meshes [2.952111139469156]
Convolutional neural networks (CNN) have been broadly studied on images, videos, graphs, and triangular meshes.
We introduce a novel interpretable graph CNN framework for the tetrahedral mesh structure.
Inspired by ChebyNet, our model exploits the volumetric Laplace-Beltrami operator (LBO) to define filters in place of the commonly used graph Laplacian.
arXiv Detail & Related papers (2023-02-08T01:52:48Z)
- Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2022-08-15T22:30:07Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a smooth, adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
arXiv Detail & Related papers (2020-10-21T20:04:22Z)
- Fast Mesh Data Augmentation via Chebyshev Polynomial of Spectral filtering [5.594792814661452]
Deep neural networks have been recognized as one of the powerful learning techniques in computer vision and medical image analysis.
In practice, there is often insufficient training data available and augmentation is used to expand the dataset.
This study proposes two unbiased augmentation methods, Laplace-Beltrami eigenfunction Data Augmentation (LB-eigDA) and Chebyshev polynomial Data Augmentation (C-pDA).
arXiv Detail & Related papers (2020-10-06T15:18:26Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Framework for Designing Filters of Spectral Graph Convolutional Neural Networks in the Context of Regularization Theory [1.0152838128195467]
Graph convolutional neural networks (GCNNs) have been widely used in graph learning.
It has been observed that the smoothness functional on graphs can be defined in terms of the graph Laplacian.
In this work, we explore the regularization properties of the graph Laplacian and propose a generalized framework for regularized filter designs in spectral GCNNs; a minimal instance of this idea is sketched after this list.
arXiv Detail & Related papers (2020-09-29T06:19:08Z)
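The last two entries above revolve around the same classical idea: a smoothness penalty built from the graph Laplacian induces an analytical low-pass spectral filter. The sketch below shows the textbook Tikhonov case as a grounding example; it is an illustration under stated assumptions, not code from any of the papers above.

```python
# A minimal sketch of an analytical low-pass graph filter derived from
# regularization, linking the graph smoothness functional x^T L x to
# spectral filter design. Solving
#     min_x ||x - y||^2 + c * x^T L x
# gives x = (I + c L)^{-1} y, whose spectral response is 1 / (1 + c*lambda).
# This is a standard construction; names here are illustrative.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def tikhonov_smooth(L, y, c):
    """Low-pass filter y on the graph by solving (I + c L) x = y."""
    n = L.shape[0]
    return spsolve((sp.eye(n, format="csc") + c * L).tocsc(), y)

# Toy usage: smooth an oscillating signal on a 5-node cycle graph.
n = 5
rows = np.arange(n).repeat(2)
cols = np.stack([(np.arange(n) - 1) % n, (np.arange(n) + 1) % n], axis=1).ravel()
A = sp.csr_matrix((np.ones(2 * n), (rows, cols)), shape=(n, n))
D = sp.diags(np.asarray(A.sum(axis=1)).ravel())
L = (D - A).tocsr()                          # combinatorial Laplacian L = D - A
y = np.array([1.0, -0.8, 1.1, -1.0, 0.9])    # high-frequency signal
print(tikhonov_smooth(L, y, c=1.0))          # attenuated toward its smooth part
```

Swapping in other penalty functionals of the Laplacian changes the spectral response accordingly, which is the degree of freedom regularization-based filter designs exploit.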
This list is automatically generated from the titles and abstracts of the papers on this site.