GPINN: Physics-informed Neural Network with Graph Embedding
- URL: http://arxiv.org/abs/2306.09792v1
- Date: Fri, 16 Jun 2023 12:03:39 GMT
- Title: GPINN: Physics-informed Neural Network with Graph Embedding
- Authors: Yuyang Miao, Haolin Li
- Abstract summary: This work proposes a Physics-informed Neural Network framework with Graph Embedding (GPINN) to apply PINNs on graphs.
The method integrates topological data into the neural network's computations, which significantly boosts the performance of the Physics-Informed Neural Network (PINN).
- Score: 1.6607142366834016
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work proposes a Physics-informed Neural Network framework with Graph
Embedding (GPINN) to apply PINNs on graphs, i.e. in topological space rather than
traditional Euclidean space, for improved problem-solving efficiency. The
method integrates topological data into the neural network's computations,
which significantly boosts the performance of the Physics-Informed Neural
Network (PINN). The graph embedding technique infuses extra dimensions into the
input space to encapsulate the spatial characteristics of a graph while
preserving the properties of the original space. The selection of these extra
dimensions is guided by the Fiedler vector, offering an optimised spectral
representation of the graph. Two case studies demonstrate a significant
improvement in the performance of GPINN over traditional PINN, particularly in
its superior ability to capture physical features of the solution.
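The graph-embedding step described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the unnormalised Laplacian L = D - A, uses a toy 4-node path graph, and appends the Fiedler vector (the eigenvector of the second-smallest Laplacian eigenvalue) as an extra input dimension for a PINN.

```python
import numpy as np

def fiedler_embedding(adjacency: np.ndarray) -> np.ndarray:
    """Return the Fiedler vector: the eigenvector of the second-smallest
    eigenvalue of the unnormalised graph Laplacian L = D - A."""
    degree = np.diag(adjacency.sum(axis=1))
    laplacian = degree - adjacency
    eigvals, eigvecs = np.linalg.eigh(laplacian)  # eigenvalues in ascending order
    return eigvecs[:, 1]  # column for the second-smallest eigenvalue

# Toy path graph on 4 nodes: 0-1-2-3 (hypothetical example, not from the paper)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

coords = np.linspace(0.0, 1.0, 4).reshape(-1, 1)  # 1-D spatial node positions
extra = fiedler_embedding(A).reshape(-1, 1)       # extra topological dimension
augmented_input = np.hstack([coords, extra])      # shape (4, 2): PINN input
print(augmented_input.shape)
```

For a connected graph the Fiedler vector is orthogonal to the constant (all-ones) eigenvector of eigenvalue zero, so the added dimension encodes purely topological variation on top of the preserved spatial coordinates.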
Related papers
- Spatiotemporal Learning on Cell-embedded Graphs [6.8090864965073274]
We introduce a learnable cell attribution to the node-edge message passing process, which better captures the spatial dependency of regional features.
Experiments on various PDE systems and one real-world dataset demonstrate that CeGNN achieves superior performance compared with other baseline models.
arXiv Detail & Related papers (2024-09-26T16:22:08Z) - DepWiGNN: A Depth-wise Graph Neural Network for Multi-hop Spatial Reasoning in Text [52.699307699505646]
We propose a novel Depth-Wise Graph Neural Network (DepWiGNN) to handle multi-hop spatial reasoning.
Specifically, we design a novel node memory scheme and aggregate the information over the depth dimension instead of the breadth dimension of the graph.
Experimental results on two challenging multi-hop spatial reasoning datasets show that DepWiGNN outperforms existing spatial reasoning methods.
arXiv Detail & Related papers (2023-10-19T08:07:22Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Feature Expansion for Graph Neural Networks [26.671557021142572]
We decompose graph neural networks into determined feature spaces and trainable weights.
We theoretically find that the feature space tends to be linearly correlated due to repeated aggregations.
Motivated by these findings, we propose 1) feature subspaces flattening and 2) structural principal components to expand the feature space.
arXiv Detail & Related papers (2023-05-10T13:45:57Z) - Edge-Level Explanations for Graph Neural Networks by Extending Explainability Methods for Convolutional Neural Networks [33.20913249848369]
Graph Neural Networks (GNNs) are deep learning models that take graph data as inputs, and they are applied to various tasks such as traffic prediction and molecular property prediction.
We extend explainability methods for CNNs, such as Local Interpretable Model-Agnostic Explanations (LIME), Gradient-Based Saliency Maps, and Gradient-Weighted Class Activation Mapping (Grad-CAM) to GNNs.
The experimental results indicate that the LIME-based approach is the most efficient explainability method for multiple tasks in real-world situations, outperforming even the state-of-the-art.
arXiv Detail & Related papers (2021-11-01T06:27:29Z) - Improving Graph Neural Networks with Simple Architecture Design [7.057970273958933]
We introduce several key design strategies for graph neural networks.
We present a simple and shallow model, Feature Selection Graph Neural Network (FSGNN).
We show that the proposed model outperforms other state-of-the-art GNN models and achieves up to 64% improvements in accuracy on node classification tasks.
arXiv Detail & Related papers (2021-05-17T06:46:01Z) - Graph Feature Gating Networks [31.20878472589719]
We propose a general graph feature gating network (GFGN) based on the graph signal denoising problem.
We also introduce three graph filters under GFGN to allow different levels of contributions from feature dimensions.
arXiv Detail & Related papers (2021-05-10T16:33:58Z) - Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z) - Graph Neural Networks for Motion Planning [108.51253840181677]
We present two techniques, GNNs over dense fixed graphs for low-dimensional problems and sampling-based GNNs for high-dimensional problems.
We examine the ability of a GNN to tackle planning problems such as identifying critical nodes or learning the sampling distribution in Rapidly-exploring Random Trees (RRT).
Experiments with critical sampling, a pendulum and a six DoF robot arm show GNNs improve on traditional analytic methods as well as learning approaches using fully-connected or convolutional neural networks.
arXiv Detail & Related papers (2020-06-11T08:19:06Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z) - Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.