MG-GNN: Multigrid Graph Neural Networks for Learning Multilevel Domain
Decomposition Methods
- URL: http://arxiv.org/abs/2301.11378v1
- Date: Thu, 26 Jan 2023 19:44:45 GMT
- Title: MG-GNN: Multigrid Graph Neural Networks for Learning Multilevel Domain
Decomposition Methods
- Authors: Ali Taghibakhshi, Nicolas Nytko, Tareq Uz Zaman, Scott MacLachlan,
Luke Olson, Matthew West
- Abstract summary: We propose multigrid graph neural networks (MG-GNN) for learning optimized parameters in two-level domain decomposition methods.
We show that MG-GNN outperforms popular hierarchical graph network architectures for this optimization.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Domain decomposition methods (DDMs) are popular solvers for discretized
systems of partial differential equations (PDEs), with one-level and multilevel
variants. These solvers rely on several algorithmic and mathematical
parameters, prescribing overlap, subdomain boundary conditions, and other
properties of the DDM. While some work has been done on optimizing these
parameters, it has mostly focused on the one-level setting or special cases
such as structured-grid discretizations with regular subdomain construction. In
this paper, we propose multigrid graph neural networks (MG-GNN), a novel GNN
architecture for learning optimized parameters in two-level DDMs. We train
MG-GNN using a new unsupervised loss function, enabling effective training on
small problems that yields robust performance on unstructured grids that are
orders of magnitude larger than those in the training set. We show that MG-GNN
outperforms popular hierarchical graph network architectures for this
optimization and that our proposed loss function is critical to achieving this
improved performance.
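The paper's precise MG-GNN architecture and loss are defined in the full text; as a rough illustration of the multigrid structure it describes, the sketch below alternates message passing on a fine graph and on a coarse graph obtained through a restriction operator R. The class name, feature sizes, and clustering-based R are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class TwoLevelGNN(nn.Module):
    """Minimal sketch of a multigrid-style two-level GNN layer: message
    passing on the fine graph, restriction to a coarse graph, message
    passing there, and a prolongated correction back to the fine level."""

    def __init__(self, dim):
        super().__init__()
        self.fine_mp = nn.Linear(2 * dim, dim)    # fine-level update
        self.coarse_mp = nn.Linear(2 * dim, dim)  # coarse-level update

    def forward(self, h, A_fine, R):
        # h: (n, d) fine node features; A_fine: (n, n) adjacency;
        # R: (m, n) restriction matrix, e.g. from a node clustering.
        h = torch.relu(self.fine_mp(torch.cat([h, A_fine @ h], dim=-1)))
        hc = R @ h                            # restrict features to coarse level
        A_coarse = R @ A_fine @ R.T           # Galerkin-style coarse graph
        hc = torch.relu(self.coarse_mp(torch.cat([hc, A_coarse @ hc], dim=-1)))
        return h + R.T @ hc                   # prolong coarse correction

# toy usage: 6 fine nodes clustered into 2 coarse nodes
n, d = 6, 8
A = (torch.rand(n, n) > 0.5).float(); A = ((A + A.T) > 0).float()
R = torch.zeros(2, n); R[0, :3] = 1.0; R[1, 3:] = 1.0
print(TwoLevelGNN(d)(torch.randn(n, d), A, R).shape)  # torch.Size([6, 8])
```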
Related papers
- Enhancing GNNs Performance on Combinatorial Optimization by Recurrent Feature Update
We introduce a novel algorithm, denoted hereafter as QRF-GNN, leveraging the power of GNNs to efficiently solve combinatorial optimization (CO) problems.
It relies on unsupervised learning by minimizing the loss function derived from QUBO relaxation.
Experimental results show that QRF-GNN drastically surpasses existing learning-based approaches and is comparable to state-of-the-art conventional methods.
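The QRF-GNN specifics (recurrent feature updates, architecture) are in the paper; the unsupervised QUBO-relaxation loss it builds on can be sketched generically. Below, binary variables are relaxed to probabilities and a toy Max-Cut QUBO is minimized by gradient descent; directly optimized logits stand in for the GNN outputs.

```python
import torch

def qubo_relaxation_loss(p, Q):
    """Relaxed QUBO objective: binary variables x in {0,1}^n are replaced
    by probabilities p in [0,1]^n, giving the differentiable loss p^T Q p."""
    return p @ Q @ p

# toy Max-Cut instance on a 4-cycle: for each edge (i, j), set Q_ij = Q_ji = 1
# and subtract 1 from Q_ii and Q_jj, so that x^T Q x = -(number of cut edges)
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4
Q = torch.zeros(n, n)
for i, j in edges:
    Q[i, j] += 1.0; Q[j, i] += 1.0
    Q[i, i] -= 1.0; Q[j, j] -= 1.0

# stand-in for GNN outputs: optimize node logits directly
logits = (0.1 * torch.randn(n)).requires_grad_()
opt = torch.optim.Adam([logits], lr=0.1)
for _ in range(200):
    opt.zero_grad()
    loss = qubo_relaxation_loss(torch.sigmoid(logits), Q)
    loss.backward()
    opt.step()
print(torch.sigmoid(logits).round())  # typically an optimal cut, e.g. [0, 1, 0, 1]
```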
arXiv Detail & Related papers (2024-07-23T13:34:35Z)
- Degree-based stratification of nodes in Graph Neural Networks
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
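As a hedged sketch of the stated idea, the layer below buckets nodes by degree and applies a separate weight matrix per bucket after a shared aggregation; the bucket boundaries and layer shape are assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class DegreeStratifiedLayer(nn.Module):
    """Sketch of degree-based stratification: nodes are bucketed by degree
    and each bucket applies its own weight matrix after aggregation."""

    def __init__(self, dim_in, dim_out, boundaries=(2, 5)):
        super().__init__()
        self.boundaries = torch.tensor(boundaries, dtype=torch.float)
        self.weights = nn.ModuleList(
            nn.Linear(dim_in, dim_out) for _ in range(len(boundaries) + 1)
        )

    def forward(self, h, A):
        deg = A.sum(dim=1)                             # node degrees
        group = torch.bucketize(deg, self.boundaries)  # group index per node
        agg = A @ h                                    # shared aggregation
        out = torch.zeros(h.shape[0], self.weights[0].out_features)
        for g, lin in enumerate(self.weights):
            mask = group == g
            if mask.any():
                out[mask] = lin(agg[mask])             # group-specific transform
        return torch.relu(out)

A = (torch.rand(10, 10) > 0.6).float()
h = torch.randn(10, 16)
print(DegreeStratifiedLayer(16, 8)(h, A).shape)  # torch.Size([10, 8])
```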
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- T-GAE: Transferable Graph Autoencoder for Network Alignment
T-GAE is a graph autoencoder framework that leverages the transferability and stability of GNNs to achieve efficient network alignment without retraining.
Our experiments demonstrate that T-GAE outperforms the state-of-the-art optimization method and the best GNN approach by up to 38.7% and 50.8%, respectively.
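The mechanism behind retraining-free alignment, a shared permutation-equivariant encoder applied to both graphs followed by matching in embedding space, can be illustrated generically. The sketch below uses random (untrained) weights purely to show the matching step; T-GAE's actual encoder and training objective are in the paper.

```python
import torch

def encode(A, X, W1, W2):
    """Two-layer permutation-equivariant GNN encoder (generic sketch)."""
    return A @ torch.relu(A @ X @ W1) @ W2

d_in, d_hid, d_out = 8, 16, 8
W1, W2 = torch.randn(d_in, d_hid), torch.randn(d_hid, d_out)  # shared weights

A1 = (torch.rand(5, 5) > 0.5).float()
perm = torch.randperm(5)
A2 = A1[perm][:, perm]          # graph 2 is a permuted copy of graph 1
X1 = torch.randn(5, d_in)
X2 = X1[perm]

# encode both graphs with the SAME encoder, then match nodes by distance
Z1, Z2 = encode(A1, X1, W1, W2), encode(A2, X2, W1, W2)
match = torch.cdist(Z1, Z2).argmin(dim=1)
print(torch.equal(match, torch.argsort(perm)))  # True: permutation recovered
```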
arXiv Detail & Related papers (2023-10-05T02:58:29Z)
- AGNN: Alternating Graph-Regularized Neural Networks to Alleviate Over-Smoothing
We propose an Alternating Graph-regularized Neural Network (AGNN) composed of a Graph Convolutional Layer (GCL) and a Graph Embedding Layer (GEL).
GEL is derived from a graph-regularized optimization problem containing a Laplacian embedding term, which alleviates the over-smoothing problem.
AGNN is evaluated via extensive experiments, including performance comparisons with multi-layer and multi-order graph neural networks.
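One common reading of a graph-regularized optimization with a Laplacian embedding term is a proximal step whose closed form is (I + λL)^{-1}H; the sketch below alternates a standard convolution with such a step. This is a minimal interpretation, not the exact AGNN layers.

```python
import torch

def gcl(H, A_hat, W):
    """Graph Convolutional Layer: standard propagation relu(A_hat H W)."""
    return torch.relu(A_hat @ H @ W)

def gel(H, L, lam=1.0):
    """Graph Embedding Layer sketch: closed-form minimizer of
    ||Z - H||_F^2 + lam * tr(Z^T L Z), i.e. Z = (I + lam L)^{-1} H.
    The Laplacian term smooths features without stacking more
    convolutions, the stated route to alleviating over-smoothing."""
    return torch.linalg.solve(torch.eye(L.shape[0]) + lam * L, H)

n, d = 6, 4
A = (torch.rand(n, n) > 0.5).float(); A = ((A + A.T) > 0).float()
deg = A.sum(1)
L = torch.diag(deg) - A                     # combinatorial graph Laplacian
A_hat = A / deg.clamp(min=1).unsqueeze(1)   # row-normalized adjacency
W = torch.randn(d, d)

H = torch.randn(n, d)
for _ in range(3):                          # alternate GCL and GEL blocks
    H = gel(gcl(H, A_hat, W), L)
print(H.shape)
```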
arXiv Detail & Related papers (2023-04-14T09:20:03Z)
- Agglomeration of Polygonal Grids using Graph Neural Networks with applications to Multigrid solvers
We propose the use of Graph Neural Networks (GNNs) to partition the connectivity graph of a computational mesh.
GNNs have the advantage of naturally and simultaneously processing both the graph structure of the mesh and the geometrical information.
Machine Learning (ML) strategies enhance performance in terms of quality metrics, with GNNs featuring a lower online computational cost.
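A generic sketch of GNN-driven mesh agglomeration: a small network maps element features (e.g., centroids) to soft cluster assignments and is trained with a relaxed normalized-cut objective plus a balance term, in the spirit of MinCut pooling. The mesh, features, loss, and sizes here are illustrative assumptions, not the paper's pipeline.

```python
import torch
import torch.nn as nn

def mincut_loss(S, A):
    """Relaxed normalized-cut objective with a balance (orthogonality)
    term, in the spirit of MinCut pooling: favor clusters with many
    internal edges while keeping cluster sizes comparable."""
    D = torch.diag(A.sum(1))
    cut = -torch.trace(S.T @ A @ S) / torch.trace(S.T @ D @ S)
    StS = S.T @ S
    k = S.shape[1]
    balance = torch.norm(StS / torch.norm(StS) - torch.eye(k) / k ** 0.5)
    return cut + balance

n, d, k = 12, 3, 3                          # 12 mesh elements, 3 agglomerates
A = (torch.rand(n, n) > 0.6).float(); A = ((A + A.T) > 0).float()
A.fill_diagonal_(0)
X = torch.randn(n, d)                       # e.g. element centroids and sizes

net = nn.Sequential(nn.Linear(d, 16), nn.ReLU(), nn.Linear(16, k))
opt = torch.optim.Adam(net.parameters(), lr=0.01)
for _ in range(100):
    opt.zero_grad()
    S = torch.softmax(net(A @ X), dim=1)    # one propagation, then an MLP
    loss = mincut_loss(S, A)
    loss.backward()
    opt.step()
print(S.argmax(1))                          # agglomerate id per element
```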
arXiv Detail & Related papers (2022-10-31T16:30:48Z)
- MGNNI: Multiscale Graph Neural Networks with Implicit Layers
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We introduce and justify two weaknesses of implicit GNNs: the constrained expressiveness due to their limited effective range for capturing long-range dependencies, and their lack of ability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
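An implicit GNN layer defines its output as the fixed point of a propagation map rather than a fixed stack of layers; the sketch below solves Z = tanh(γAZW + X) by simple iteration and concatenates two propagation scales as a rough reading of the multiscale idea. None of this is MGNNI's exact formulation.

```python
import torch

def implicit_layer(A, bX, W, gamma=0.8, iters=50):
    """Implicit GNN layer sketch: output is the fixed point of
    Z = tanh(gamma * A Z W + bX), found by simple iteration. With
    ||A|| <= 1, ||W|| <= 1 and gamma < 1 the map is contractive, and
    the effective range comes from the fixed point, not a stacked depth."""
    Z = torch.zeros_like(bX)
    for _ in range(iters):
        Z = torch.tanh(gamma * A @ Z @ W + bX)
    return Z

n, d = 8, 4
A = (torch.rand(n, n) > 0.5).float(); A = ((A + A.T) > 0).float()
deg = A.sum(1).clamp(min=1)
A = A / torch.sqrt(deg.unsqueeze(0) * deg.unsqueeze(1))  # symmetric normalization, ||A|| <= 1
X = torch.randn(n, d)
W = torch.randn(d, d); W = W / torch.linalg.norm(W, 2)   # spectral norm 1

# multiscale: one implicit layer per propagation scale (A and A @ A),
# with outputs concatenated, as a rough reading of the multiscale idea
Z = torch.cat([implicit_layer(A, X, W), implicit_layer(A @ A, X, W)], dim=1)
print(Z.shape)  # torch.Size([8, 8])
```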
arXiv Detail & Related papers (2022-10-15T18:18:55Z)
- ASGNN: Graph Neural Networks with Adaptive Structure
We propose a novel interpretable message passing scheme with adaptive structure (ASMP) to defend against adversarial attacks on graph structure.
ASMP is adaptive in the sense that the message passing in different layers can be carried out over dynamically adjusted graphs.
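A hedged sketch of message passing over a dynamically adjusted graph: each layer reweights the given edges by current feature similarity before propagating, so suspicious edges can be down-weighted. This is a generic interpretation, not the exact ASMP scheme.

```python
import torch
import torch.nn as nn

class AdaptiveStructureLayer(nn.Module):
    """Sketch of message passing with an adaptively adjusted graph:
    edge weights are recomputed each layer from current node features,
    so propagation can down-weight suspicious (e.g. adversarial) edges."""

    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, h, A):
        sim = torch.cosine_similarity(h.unsqueeze(1), h.unsqueeze(0), dim=-1)
        A_adj = A * torch.relu(sim)                   # keep only plausible edges
        A_adj = A_adj / A_adj.sum(1, keepdim=True).clamp(min=1e-6)
        return torch.relu(self.lin(A_adj @ h)), A_adj

n, d = 6, 8
A = (torch.rand(n, n) > 0.5).float(); A = ((A + A.T) > 0).float()
h = torch.randn(n, d)
layer = AdaptiveStructureLayer(d)
for _ in range(2):                  # graph is re-adjusted at every layer
    h, A_used = layer(h, A)
print(h.shape, A_used.shape)
```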
arXiv Detail & Related papers (2022-10-03T15:10:40Z)
- A Bipartite Graph Neural Network Approach for Scalable Beamforming Optimization
Deep learning (DL) techniques have been intensively studied for the optimization of multi-user single-input single-output (MU-MISO) systems.
This paper develops a framework for beamforming networks that is scalable with respect to the numbers of antennas and users.
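The scalability argument rests on shared per-node weights in a bipartite antenna-user graph, so one trained layer applies to any problem size. The sketch below shows that shape-agnostic message passing with illustrative features and update rules; it is not the paper's beamforming architecture.

```python
import torch
import torch.nn as nn

class BipartiteMP(nn.Module):
    """Sketch of bipartite message passing between antenna and user nodes.
    Shared per-node weights make the layer reusable for any numbers of
    antennas and users."""

    def __init__(self, d):
        super().__init__()
        self.to_user = nn.Linear(2 * d, d)
        self.to_ant = nn.Linear(2 * d, d)

    def forward(self, h_ant, h_user):
        # aggregate across the opposite side (fully bipartite connectivity)
        m_user = h_ant.mean(0, keepdim=True).expand_as(h_user)
        h_user = torch.relu(self.to_user(torch.cat([h_user, m_user], -1)))
        m_ant = h_user.mean(0, keepdim=True).expand_as(h_ant)
        h_ant = torch.relu(self.to_ant(torch.cat([h_ant, m_ant], -1)))
        return h_ant, h_user

d = 8
layer = BipartiteMP(d)
# the same layer runs on 4 antennas / 2 users or 64 antennas / 8 users
for n_ant, n_user in [(4, 2), (64, 8)]:
    h_a, h_u = layer(torch.randn(n_ant, d), torch.randn(n_user, d))
    print(h_a.shape, h_u.shape)
```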
arXiv Detail & Related papers (2022-07-12T07:59:21Z)
- Contrastive Adaptive Propagation Graph Neural Networks for Efficient Graph Learning
Graph Neural Networks (GNNs) have achieved great success in processing graph data by extracting and propagating structure-aware features.
Recently, the field has advanced from local propagation schemes that focus on local neighbors towards extended propagation schemes that can directly deal with extended neighbors consisting of both local and high-order neighbors.
Despite the impressive performance, existing approaches are still insufficient to build an efficient and learnable extended propagation scheme that can adaptively adjust the influence of local and high-order neighbors.
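A learnable extended propagation scheme can be sketched as a softmax-weighted mixture of propagation hops (GPR-style), letting the model adjust the influence of local versus high-order neighbors; the form below is illustrative, not the paper's contrastive scheme.

```python
import torch
import torch.nn as nn

class AdaptivePropagation(nn.Module):
    """Sketch of a learnable extended propagation scheme: features from
    multiple propagation hops (local and high-order neighbors) are mixed
    with learned, softmax-normalized coefficients, so each hop's
    influence is adaptively adjusted."""

    def __init__(self, hops=4):
        super().__init__()
        self.alpha = nn.Parameter(torch.zeros(hops + 1))

    def forward(self, X, A_hat):
        w = torch.softmax(self.alpha, dim=0)   # learned hop weights
        out, H = w[0] * X, X
        for k in range(1, len(w)):
            H = A_hat @ H                      # one more propagation hop
            out = out + w[k] * H
        return out

n, d = 10, 5
A = (torch.rand(n, n) > 0.5).float(); A = ((A + A.T) > 0).float()
A_hat = A / A.sum(1, keepdim=True).clamp(min=1)
print(AdaptivePropagation()(torch.randn(n, d), A_hat).shape)
```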
arXiv Detail & Related papers (2021-12-02T10:35:33Z)
- Policy-GNN: Aggregation Optimization for Graph Neural Networks
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that combines the sampling procedure and message passing of GNNs into a single learning process.
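As a hedged sketch of the core idea, the layer below lets a small policy network choose, per node, how many aggregation hops to apply; here the choice is a hard argmax for illustration, whereas Policy-GNN trains the policy with reinforcement learning.

```python
import torch
import torch.nn as nn

class PerNodeDepthAggregation(nn.Module):
    """Sketch of Policy-GNN's core idea: a (meta-)policy picks, per node,
    how many aggregation layers to apply. Here the policy is a linear
    scorer with a hard argmax, purely for illustration."""

    def __init__(self, d, max_depth=3):
        super().__init__()
        self.policy = nn.Linear(d, max_depth)   # scores for depths 1..max_depth
        self.max_depth = max_depth

    def forward(self, X, A_hat):
        depth = self.policy(X).argmax(1) + 1    # chosen depth per node
        H, out = X, torch.zeros_like(X)
        for k in range(1, self.max_depth + 1):
            H = A_hat @ H                       # k-hop aggregated features
            out[depth == k] = H[depth == k]     # nodes that stop at depth k
        return out, depth

n, d = 10, 6
A = (torch.rand(n, n) > 0.5).float(); A = ((A + A.T) > 0).float()
A_hat = A / A.sum(1, keepdim=True).clamp(min=1)
out, depth = PerNodeDepthAggregation(d)(torch.randn(n, d), A_hat)
print(out.shape, depth.tolist())
```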
arXiv Detail & Related papers (2020-06-26T17:03:06Z)
- Binarized Graph Neural Network
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
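Binarized networks typically use sign() in the forward pass with a straight-through gradient estimator; the sketch below applies that standard trick to both weights and aggregated features. It is a generic binarized graph layer, not BGN's exact design.

```python
import torch
import torch.nn as nn

class BinarizeSTE(torch.autograd.Function):
    """sign() in the forward pass, straight-through (clipped identity)
    gradient in the backward pass: the standard trick for training
    binarized networks."""
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, g):
        (x,) = ctx.saved_tensors
        return g * (x.abs() <= 1).float()   # pass gradient only where |x| <= 1

class BinarizedGraphLayer(nn.Module):
    """Sketch of a binarized GNN layer: both the weight matrix and the
    aggregated features are binarized to +/-1, so the matmul could be
    implemented with XNOR/popcount for time and space savings."""
    def __init__(self, d_in, d_out):
        super().__init__()
        self.W = nn.Parameter(torch.randn(d_in, d_out) * 0.1)

    def forward(self, H, A_hat):
        Hb = BinarizeSTE.apply(A_hat @ H)   # binarized aggregation
        Wb = BinarizeSTE.apply(self.W)      # binarized weights
        return Hb @ Wb

n, d = 8, 16
A = (torch.rand(n, n) > 0.5).float(); A = ((A + A.T) > 0).float()
A_hat = A / A.sum(1, keepdim=True).clamp(min=1)
out = BinarizedGraphLayer(d, 4)(torch.randn(n, d), A_hat)
print(out.shape, out.unique())  # entries are small integers (sums of +/-1)
```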
arXiv Detail & Related papers (2020-04-19T09:43:14Z)