Dirichlet Energy Enhancement of Graph Neural Networks by Framelet
Augmentation
- URL: http://arxiv.org/abs/2311.05767v1
- Date: Thu, 9 Nov 2023 22:22:18 GMT
- Title: Dirichlet Energy Enhancement of Graph Neural Networks by Framelet
Augmentation
- Authors: Jialin Chen, Yuelin Wang, Cristian Bodnar, Rex Ying, Pietro Lio, Yu
Guang Wang
- Abstract summary: We introduce a framelet system into the analysis of Dirichlet energy and take a multi-scale perspective to leverage the Dirichlet energy.
Based on that, we design the Energy Enhanced Convolution (EEConv), which is an effective and practical operation.
Experiments show that deep GNNs with EEConv achieve state-of-the-art performance over various node classification datasets.
- Score: 19.56268823452656
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolutions have been a pivotal element in learning graph
representations. However, recursively aggregating neighboring information with
graph convolutions leads to indistinguishable node features in deep layers,
which is known as the over-smoothing issue. The performance of graph neural
networks decays fast as the number of stacked layers increases, and the
Dirichlet energy associated with the graph decreases to zero as well. In this
work, we introduce a framelet system into the analysis of Dirichlet energy and
take a multi-scale perspective to leverage the Dirichlet energy and alleviate
the over-smoothing issue. Specifically, we develop a Framelet Augmentation
strategy by adjusting the update rules with positive and negative increments
for the low-pass and high-pass frames, respectively. Based on that, we design
the Energy Enhanced Convolution (EEConv), an effective and practical operation
that is proven to strictly enhance the Dirichlet energy. From a message-passing
perspective, EEConv inherits the multi-hop aggregation property of the framelet
transform and takes into account all hops in the multi-scale representation,
which benefits node classification tasks on heterophilous graphs.
Experiments show that deep GNNs with EEConv achieve state-of-the-art
performance over various node classification datasets, especially for
heterophilous graphs, while also lifting the Dirichlet energy as the network
goes deeper.
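The over-smoothing effect described in the abstract can be reproduced in a few lines. The following is a minimal sketch (not the paper's code; the graph and function names are illustrative): the Dirichlet energy E(X) = tr(XᵀLX) of node features X is tracked while the features are repeatedly averaged with a symmetric normalized adjacency, the basic aggregation step in many GNNs.

```python
import numpy as np

def dirichlet_energy(X, L):
    """E(X) = tr(X^T L X), i.e. a weighted sum of ||x_i - x_j||^2 over edges."""
    return float(np.trace(X.T @ L @ X))

# Small non-bipartite graph: triangle 0-1-2 with a pendant node 3 on node 2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt   # symmetric normalized adjacency
L = np.eye(4) - A_hat                 # normalized graph Laplacian

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))           # random node features

energies = []
for _ in range(6):
    energies.append(dirichlet_energy(X, L))
    X = A_hat @ X                     # one round of neighborhood averaging

# The energy is non-increasing and decays toward zero: node features
# become indistinguishable with depth (over-smoothing).
print(energies)
```

EEConv is designed to counteract exactly this decay; the framelet decomposition lets the update boost the components of X that carry the energy instead of uniformly averaging them away.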
Related papers
- SGFormer: Single-Layer Graph Transformers with Approximation-Free Linear Complexity [74.51827323742506]
We evaluate the necessity of adopting multi-layer attentions in Transformers on graphs.
We show that multi-layer propagation can be reduced to one-layer propagation, with the same capability for representation learning.
It suggests a new technical path for building powerful and efficient Transformers on graphs.
arXiv Detail & Related papers (2024-09-13T17:37:34Z)
- A Theoretical Formulation of Many-body Message Passing Neural Networks [0.0]
We present a many-body Message Passing Neural Network (MPNN) framework that models higher-order node interactions.
We apply localized spectral filters on the motif Laplacian, weighted by global edge Ricci curvatures.
We prove our formulation is invariant to neighbor node permutation, derive its sensitivity bound, and bound the range of learned graph potential.
arXiv Detail & Related papers (2024-07-16T14:18:48Z)
- Self-Attention Empowered Graph Convolutional Network for Structure Learning and Node Embedding [5.164875580197953]
In representation learning on graph-structured data, many popular graph neural networks (GNNs) fail to capture long-range dependencies.
This paper proposes a novel graph learning framework called the graph convolutional network with self-attention (GCN-SA).
The proposed scheme exhibits an exceptional generalization capability in node-level representation learning.
arXiv Detail & Related papers (2024-03-06T05:00:31Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
arXiv Detail & Related papers (2024-01-15T14:36:38Z)
- A Fractional Graph Laplacian Approach to Oversmoothing [15.795926248847026]
We generalize the concept of oversmoothing from undirected to directed graphs.
We propose fractional graph Laplacian neural ODEs, which describe non-local dynamics.
Our method is more flexible with respect to the convergence of the graph's Dirichlet energy, thereby mitigating oversmoothing.
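The non-local dynamics in this related paper rest on fractional powers of the graph Laplacian. As a hedged sketch of that core idea only (the paper's actual neural ODE construction is more involved and is not reproduced here), a fractional power L^α of a symmetric Laplacian can be formed via its eigendecomposition, and is dense even when L is sparse:

```python
import numpy as np

def fractional_laplacian(L, alpha):
    """L^alpha = U diag(lam^alpha) U^T for symmetric PSD L = U diag(lam) U^T."""
    lam, U = np.linalg.eigh(L)
    lam = np.clip(lam, 0.0, None)  # guard tiny negative round-off
    return U @ np.diag(lam ** alpha) @ U.T

# Path graph on 4 nodes: L couples only adjacent nodes.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A  # combinatorial Laplacian

L_half = fractional_laplacian(L, 0.5)
# Entry (0, 3) is zero in L (nodes 0 and 3 are not adjacent) but nonzero
# in L^{1/2}: the fractional operator couples distant nodes, which is the
# "non-local dynamics" intuition.
print(L[0, 3], L_half[0, 3])
```

Because the operator mixes information across non-adjacent nodes in a single step, the Dirichlet energy of the resulting dynamics need not decay monotonically the way it does under purely local averaging.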
arXiv Detail & Related papers (2023-05-22T14:52:33Z)
- Graph Transformer GANs for Graph-Constrained House Generation [223.739067413952]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The GTGAN learns effective graph node relations in an end-to-end fashion for the challenging graph-constrained house generation task.
arXiv Detail & Related papers (2023-03-14T20:35:45Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
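For readers unfamiliar with SGCNs, the following is a minimal sketch of plain spectral graph convolution, the operation that wavelet-based variants build on. It filters a graph signal in the Laplacian eigenbasis, y = U g(Λ) Uᵀ x; the paper's lifting-based adaptive wavelets replace the fixed filter g below with learned, localized filter banks, and that construction is not shown here.

```python
import numpy as np

def spectral_conv(L, x, g):
    """Filter signal x in the eigenbasis of L: y = U g(lam) U^T x."""
    lam, U = np.linalg.eigh(L)
    return U @ (g(lam) * (U.T @ x))

# Triangle graph.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

x = np.array([1.0, -1.0, 0.0])  # zero-mean (purely high-frequency) signal

# Low-pass heat-kernel filter g(lam) = exp(-t * lam): damps high graph
# frequencies, smoothing the signal toward its mean.
y = spectral_conv(L, x, lambda lam: np.exp(-0.5 * lam))
print(y)
```

On this triangle, x lies entirely in the λ = 3 eigenspace of L, so the filter simply scales it by exp(-1.5); on larger graphs different frequency components are attenuated by different amounts, which is what makes the choice of g (or of a wavelet filter bank) matter.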
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cuts, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)