How Framelets Enhance Graph Neural Networks
- URL: http://arxiv.org/abs/2102.06986v1
- Date: Sat, 13 Feb 2021 19:19:19 GMT
- Title: How Framelets Enhance Graph Neural Networks
- Authors: Xuebin Zheng, Bingxin Zhou, Junbin Gao, Yu Guang Wang, Pietro Lio,
Ming Li, Guido Montufar
- Abstract summary: This paper presents a new approach for assembling graph neural networks based on framelet transforms.
We propose shrinkage as a new activation for the framelet convolution, which thresholds the high-frequency information at different scales.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a new approach for assembling graph neural networks based
on framelet transforms. The latter provides a multi-scale representation for
graph-structured data. With the framelet system, we can decompose the graph
feature into low-pass and high-pass frequencies as extracted features for
network training, which then defines a framelet-based graph convolution. The
framelet decomposition naturally induces a graph pooling strategy by
aggregating the graph feature into low-pass and high-pass spectra, which
considers both the feature values and geometry of the graph data and conserves
the total information. The graph neural networks with the proposed framelet
convolution and pooling achieve state-of-the-art performance in many types of
node and graph prediction tasks. Moreover, we propose shrinkage as a new
activation for the framelet convolution, which thresholds the high-frequency
information at different scales. Compared to ReLU, shrinkage in framelet
convolution improves the graph neural network model in terms of denoising and
signal compression: noises in both node and structure can be significantly
reduced by accurately cutting off the high-pass coefficients from framelet
decomposition, and the signal can be compressed to less than half its original
size with the prediction performance well preserved.
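The two-channel decomposition and the shrinkage activation described above can be sketched on a toy graph. In the snippet below, the spectral masks `g0` and `g1` are an illustrative tight pair satisfying g0² + g1² = 1, not the paper's actual framelet filters, and the threshold value is arbitrary; it shows the low-pass/high-pass split, soft-thresholding of the high-pass coefficients, and the perfect-reconstruction property of a tight frame when no thresholding is applied.

```python
import numpy as np

# Toy graph: a 5-node path, with adjacency matrix A.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0

# Normalized Laplacian L = I - D^{-1/2} A D^{-1/2}; eigenvalues lie in [0, 2].
d = A.sum(axis=1)
Dinv = np.diag(1.0 / np.sqrt(d))
L = np.eye(5) - Dinv @ A @ Dinv
lam, U = np.linalg.eigh(L)

# Illustrative tight pair of spectral masks: g0^2 + g1^2 = 1, so the
# two-channel transform conserves the total energy of the signal.
g0 = np.cos(np.pi * lam / 4.0)   # low-pass: close to 1 near lambda = 0
g1 = np.sin(np.pi * lam / 4.0)   # high-pass: close to 1 near lambda = 2

x = np.array([1.0, 1.2, 5.0, 1.1, 0.9])   # graph signal with a noisy spike

# Analysis: project the signal into the two frequency channels.
xhat = U.T @ x
low, high = g0 * xhat, g1 * xhat

# Shrinkage activation: soft-threshold the high-pass coefficients.
t = 0.5
high_shrunk = np.sign(high) * np.maximum(np.abs(high) - t, 0.0)

# Synthesis: since g0^2 + g1^2 = 1, recombining the untouched channels
# inverts the transform exactly (tight-frame property).
x_denoised = U @ (g0 * low + g1 * high_shrunk)
x_exact = U @ (g0 * low + g1 * high)
assert np.allclose(x_exact, x)   # perfect reconstruction without shrinkage
```

Thresholding the high-pass channel suppresses the high-frequency content around the spike while the low-pass channel, carrying the smooth part of the signal, passes through unchanged, which is the denoising behavior the abstract attributes to shrinkage.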
Related papers
- Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
arXiv Detail & Related papers (2024-01-15T14:36:38Z)
- Data-Adaptive Graph Framelets with Generalized Vanishing Moments for Graph Signal Processing [2.039632659682125]
We propose a framework to construct tight framelet systems on graphs with localized supports based on hierarchical partitions.
Our construction provides parametrized graph framelet systems with great generality based on partition trees.
We show that our learned graph framelet systems perform superiorly in non-linear approximation and denoising tasks.
arXiv Detail & Related papers (2023-09-07T07:49:43Z)
- Structure-free Graph Condensation: From Large-scale Graphs to Condensed Graph-free Data [91.27527985415007]
Existing graph condensation methods rely on the joint optimization of nodes and structures in the condensed graph.
We advocate a new Structure-Free Graph Condensation paradigm, named SFGC, to distill a large-scale graph into a small-scale graph node set.
arXiv Detail & Related papers (2023-06-05T07:53:52Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
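The unrolling idea behind GDN can be illustrated generically: each network layer corresponds to one proximal gradient (ISTA-style) step for a sparse recovery objective, with the step size and threshold as the per-layer parameters that a GDN-style model would learn. The sketch below uses fixed rather than learned parameters and a generic sparse linear-inverse problem with illustrative variable names; it is not the paper's actual architecture or objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative inverse problem: observe y = M @ x_true with sparse x_true.
M = rng.standard_normal((30, 20)) / np.sqrt(30)
x_true = np.zeros(20)
x_true[[2, 7, 15]] = [1.5, -2.0, 1.0]
y = M @ x_true

def soft(v, t):
    """Soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Unrolled proximal gradient: each "layer" is one ISTA step for
# min_x 0.5 * ||y - M x||^2 + tau * ||x||_1.
# In a GDN-style network, alpha and tau would be learned per layer.
alpha, tau, K = 0.3, 0.01, 1000
x = np.zeros(20)
for _ in range(K):
    x = soft(x - alpha * M.T @ (M @ x - y), alpha * tau)
# x now approximates the sparse x_true.
```

Truncating the loop at a small fixed K and making `alpha` and `tau` trainable turns this iterative solver into a feed-forward network, which is the sense in which GDNs "unroll and truncate" proximal gradient iterations.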
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Quasi-Framelets: Robust Graph Neural Networks via Adaptive Framelet Convolution [28.474359021962346]
We propose a multiscale framelet convolution for spectral graph neural networks (GNNs).
The proposed design excels in filtering out unwanted spectral information and significantly reduces the adverse effects of noisy graph signals.
It exhibits remarkable resilience to noisy data and adversarial attacks, highlighting its potential as a robust solution for real-world graph applications.
arXiv Detail & Related papers (2022-01-11T00:10:28Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Decimated Framelet System on Graphs and Fast G-Framelet Transforms [3.7277730514654555]
An adequate representation of graph data is vital to the learning performance of a statistical or machine learning model for graph-structured data.
We propose a novel multiscale representation system for graph data, called decimated framelets, which form a localized tight frame on the graph.
The effectiveness is demonstrated by real-world applications, including multiresolution analysis for traffic network, and graph neural networks for graph classification tasks.
arXiv Detail & Related papers (2020-12-12T23:57:17Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Representation Learning of Graphs Using Graph Convolutional Multilayer Networks Based on Motifs [17.823543937167848]
mGCMN is a novel framework that utilizes node feature information and the higher-order local structure of the graph.
It greatly improves the learning efficiency of graph neural networks and establishes a new mode of learning.
arXiv Detail & Related papers (2020-07-31T04:18:20Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.