ACMP: Allen-Cahn Message Passing for Graph Neural Networks with Particle
Phase Transition
- URL: http://arxiv.org/abs/2206.05437v3
- Date: Mon, 24 Apr 2023 01:52:14 GMT
- Title: ACMP: Allen-Cahn Message Passing for Graph Neural Networks with Particle
Phase Transition
- Authors: Yuelin Wang, Kai Yi, Xinliang Liu, Yu Guang Wang, Shi Jin
- Abstract summary: We introduce Allen-Cahn message passing (ACMP) for graph neural networks.
The dynamics of the system is a reaction-diffusion process which can separate particles without blowing up.
It provides a model of GNNs circumventing the common GNN problem of oversmoothing messages.
- Score: 24.59894322312533
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural message passing is a basic feature extraction unit for
graph-structured data considering neighboring node features in network
propagation from one layer to the next. We model such a process by an interacting
particle system with attractive and repulsive forces and the Allen-Cahn force
arising in the modeling of phase transition. The dynamics of the system is a
reaction-diffusion process which can separate particles without blowing up.
This induces an Allen-Cahn message passing (ACMP) for graph neural networks
where the numerical iteration for the particle system solution constitutes the
message passing propagation. ACMP, which has a simple implementation with a
neural ODE solver, can propel the network depth up to one hundred layers with a
theoretically proven, strictly positive lower bound on the Dirichlet energy. It
thus provides a deep model of GNNs circumventing the common GNN problem of
oversmoothing. GNNs with ACMP achieve state-of-the-art performance for
real-world node classification tasks on both homophilic and heterophilic
datasets.
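The dynamics sketched in the abstract can be made concrete. In a minimal form, each node feature evolves as dx_i/dt = α Σ_j w_ij (x_j − x_i) + δ x_i (1 − x_i²), where the signed couplings w_ij model attractive and repulsive particle forces and the cubic term is the Allen-Cahn double-well force. The NumPy sketch below iterates this system with an explicit Euler step; the parameter values and the random signed couplings are illustrative assumptions, not the authors' implementation (which uses learned attention weights and a neural ODE solver).

```python
import numpy as np

rng = np.random.default_rng(0)

def acmp_step(x, w, alpha=0.5, delta=1.0, dt=0.1):
    """One explicit-Euler step of an Allen-Cahn message-passing sketch.

    x : (n, d) node features
    w : (n, n) signed couplings; positive entries act as attraction,
        negative ones as repulsion (learned in the paper, random here)
    """
    # interacting-particle term: sum_j w_ij * (x_j - x_i)
    diffusion = w @ x - w.sum(axis=1, keepdims=True) * x
    # Allen-Cahn double-well force delta * x * (1 - x^2), which drives
    # each feature channel toward +-1 instead of a common average
    reaction = delta * x * (1.0 - x ** 2)
    return x + dt * (alpha * diffusion + reaction)

n, d = 8, 4
x = 0.1 * rng.standard_normal((n, d))      # small initial features
w = 0.2 * rng.standard_normal((n, n))      # hypothetical signed couplings

for _ in range(200):
    x = acmp_step(x, w)
```

Because the double-well force saturates features near ±1, repeated propagation separates node states rather than averaging them all to one value, which is the oversmoothing behavior (vanishing Dirichlet energy) the paper circumvents.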
Related papers
- Bundle Neural Networks for message diffusion on graphs [10.018379001231356]
We prove that Bundle Neural Networks (BuNNs) can approximate any feature transformation over nodes on any family of graphs given injective positional encodings, resulting in universal node-level expressivity.
arXiv Detail & Related papers (2024-05-24T13:28:48Z)
- Neighborhood Convolutional Network: A New Paradigm of Graph Neural Networks for Node Classification [12.062421384484812]
Graph Convolutional Network (GCN) decouples neighborhood aggregation and feature transformation in each convolutional layer.
In this paper, we propose a new paradigm of GCN, termed Neighborhood Convolutional Network (NCN).
In this way, the model inherits the merit of decoupled GCN for aggregating neighborhood information while developing much more powerful feature learning modules.
arXiv Detail & Related papers (2022-11-15T02:02:51Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Transformer with Implicit Edges for Particle-based Physics Simulation [135.77656965678196]
Transformer with Implicit Edges (TIE) captures the rich semantics of particle interactions in an edge-free manner.
We evaluate our model on diverse domains of varying complexity and materials.
arXiv Detail & Related papers (2022-07-22T03:45:29Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Spiking Graph Convolutional Networks [19.36064180392385]
SpikingGCN is an end-to-end framework that aims to integrate the embedding of GCNs with the biofidelity characteristics of SNNs.
We show that SpikingGCN on a neuromorphic chip can bring a clear advantage of energy efficiency into graph data analysis.
arXiv Detail & Related papers (2022-05-05T16:44:36Z)
- Resonant tunnelling diode nano-optoelectronic spiking nodes for neuromorphic information processing [0.0]
We introduce an optoelectronic artificial neuron capable of operating at ultrafast rates and with low energy consumption.
The proposed system combines an excitable resonant tunnelling diode (RTD) element with a nanoscale light source.
arXiv Detail & Related papers (2021-07-14T14:11:04Z)
- A tensor network representation of path integrals: Implementation and analysis [0.0]
We introduce a novel tensor network-based decomposition of path integral simulations involving the Feynman-Vernon influence functional.
The finite, temporally non-local interactions introduced by the influence functional can be captured very efficiently using a matrix product state representation.
The flexibility of the AP-TNPI framework makes it a promising new addition to the family of path integral methods for non-equilibrium quantum dynamics.
arXiv Detail & Related papers (2021-06-23T16:41:54Z)
- Spatio-Temporal Inception Graph Convolutional Networks for Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
arXiv Detail & Related papers (2020-11-26T14:43:04Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.