Graph Neural Network as Computationally Efficient Emulator of Ice-sheet and Sea-level System Model (ISSM)
- URL: http://arxiv.org/abs/2407.01464v1
- Date: Wed, 26 Jun 2024 16:13:11 GMT
- Title: Graph Neural Network as Computationally Efficient Emulator of Ice-sheet and Sea-level System Model (ISSM)
- Authors: Younghyun Koo, Maryam Rahnemoonfar
- Abstract summary: We design a graph convolutional network (GCN) as a fast emulator for the Ice-sheet and Sea-level System Model (ISSM).
The GCN is 34 times faster than the CPU-based ISSM modeling.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Ice-sheet and Sea-level System Model (ISSM) provides solutions of the Stokes equations relevant to ice sheet dynamics by employing finite elements and fine mesh adaptation. However, because its finite element method is compatible only with Central Processing Units (CPUs), the ISSM is limited in how much further it can reduce computational time. Thus, by taking advantage of Graphics Processing Units (GPUs), we design a graph convolutional network (GCN) as a fast emulator for ISSM. The GCN is trained and tested on 20-year transient ISSM simulations of the Pine Island Glacier (PIG). The GCN reproduces ice thickness and velocity with a correlation coefficient greater than 0.998, outperforming the traditional convolutional neural network (CNN). Additionally, the GCN is 34 times faster than the CPU-based ISSM modeling. The GPU-based GCN emulator allows us to predict how the PIG will change in the future under different melting-rate scenarios with high fidelity and much shorter computation time.
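To make the emulator concrete, below is a minimal sketch of a GCN surrogate of this kind, assuming a PyTorch Geometric graph built from the ISSM mesh; the input features, layer widths, and training loop are illustrative assumptions rather than the authors' released implementation.

```python
# Sketch of a GCN emulator for ISSM-style mesh data (assumed setup, not the
# authors' code). Each mesh node carries input features such as bed elevation,
# basal melting rate, and the current thickness/velocity; the network regresses
# the next-step ice thickness and velocity components at every node.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv

class ISSMEmulatorGCN(nn.Module):
    def __init__(self, in_dim: int = 5, hidden: int = 128, out_dim: int = 3):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.head = nn.Linear(hidden, out_dim)      # thickness, vx, vy

    def forward(self, x, edge_index):
        h = torch.relu(self.conv1(x, edge_index))
        h = torch.relu(self.conv2(h, edge_index))
        return self.head(h)

# Toy graph standing in for a finite-element mesh: 4 nodes on a chain.
x = torch.randn(4, 5)                               # per-node input features
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]], dtype=torch.long)
y = torch.randn(4, 3)                               # target thickness and velocity

model = ISSMEmulatorGCN()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):                                # regress against ISSM outputs
    optim.zero_grad()
    loss = nn.functional.mse_loss(model(x, edge_index), y)
    loss.backward()
    optim.step()
```

Because message passing operates on the mesh connectivity itself, the same network applies to meshes of different sizes and densities, which is one reason a graph-based emulator can respect the finite-element structure better than a pixel-grid CNN.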
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the time and space complexities from cubic and quadratic in the sequence length, respectively, to linear.
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Up-sampling-only and Adaptive Mesh-based GNN for Simulating Physical Systems [7.384641647468888]
We develop a novel hierarchical Mesh Graph Network, namely UA-MGN, for efficient and effective mechanical simulation.
Evaluation on two synthetic datasets and one real dataset demonstrates the superiority of UA-MGN.
arXiv Detail & Related papers (2024-09-07T07:09:58Z)
- Graph Neural Networks for Emulation of Finite-Element Ice Dynamics in Greenland and Antarctic Ice Sheets [0.0]
An equivariant graph convolutional network (EGCN) is presented as an emulator for ice sheet dynamics modeling.
EGCN reproduces ice thickness and velocity changes in the Helheim Glacier, Greenland, and Pine Island Glacier, Antarctica, with 260 times and 44 times faster computation time, respectively.
arXiv Detail & Related papers (2024-06-26T15:18:49Z)
- Uncertainty-enabled machine learning for emulation of regional sea-level change caused by the Antarctic Ice Sheet [0.8130739369606821]
We build neural-network emulators of sea-level change at 27 coastal locations.
We show that the neural-network emulators have an accuracy that is competitive with baseline machine learning emulators.
arXiv Detail & Related papers (2024-06-21T18:27:09Z)
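As a rough illustration of an uncertainty-enabled emulator like the one summarized above, the sketch below uses a deep ensemble of MLPs mapping forcing parameters to sea-level change at 27 coastal locations; the ensemble approach, input dimensionality, and training setup are assumptions for illustration, not necessarily the paper's method.

```python
# Deep-ensemble emulator sketch: several independently initialized MLPs map a
# forcing/parameter vector to sea-level change at 27 sites; the ensemble mean
# is the prediction and the spread is a simple uncertainty estimate.
import torch
import torch.nn as nn

def make_mlp(in_dim=8, out_dim=27, hidden=64):
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, hidden), nn.ReLU(),
                         nn.Linear(hidden, out_dim))

ensemble = [make_mlp() for _ in range(5)]           # 5 ensemble members

X = torch.randn(256, 8)                             # toy forcing parameters
Y = torch.randn(256, 27)                            # toy sea-level responses

for model in ensemble:                              # train each member independently
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(200):
        opt.zero_grad()
        nn.functional.mse_loss(model(X), Y).backward()
        opt.step()

x_new = torch.randn(1, 8)
preds = torch.stack([m(x_new) for m in ensemble])   # (5, 1, 27)
mean, std = preds.mean(dim=0), preds.std(dim=0)     # prediction and uncertainty
```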
- Graph Neural Networks as Fast and High-fidelity Emulators for Finite-Element Ice Sheet Modeling [0.0]
We develop graph neural networks (GNNs) as fast surrogate models that preserve the finite element structure of the Ice-sheet and Sea-level System Model (ISSM).
The GNNs reproduce ice thickness and velocity with better accuracy than the classic convolutional neural network (CNN) and multi-layer perceptron (MLP).
arXiv Detail & Related papers (2024-02-07T22:10:36Z)
- Deep convolutional surrogates and degrees of freedom in thermal design [0.0]
Convolutional Neural Networks (CNNs) are used to predict results of Computational Fluid Dynamics (CFD) directly from topologies saved as images.
We present surrogate models for heat transfer and pressure drop prediction of complex fin geometries generated using composite Bezier curves.
arXiv Detail & Related papers (2022-08-16T00:45:39Z)
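For an image-based CFD surrogate of the kind described in the entry above, a minimal version is a small CNN that regresses two scalar outputs (e.g., heat transfer and pressure drop) from a binary geometry image; the resolution, channel counts, and targets here are illustrative assumptions, not the paper's architecture.

```python
# CNN surrogate sketch: map a fin-geometry mask to two scalar CFD quantities.
import torch
import torch.nn as nn

class GeometrySurrogate(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 2)                 # [heat transfer, pressure drop]

    def forward(self, img):
        return self.head(self.features(img).flatten(1))

imgs = (torch.rand(8, 1, 64, 64) > 0.5).float()      # toy binary geometry masks
targets = torch.randn(8, 2)                          # toy CFD results
loss = nn.functional.mse_loss(GeometrySurrogate()(imgs), targets)
loss.backward()
```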
- GLEAM: Greedy Learning for Large-Scale Accelerated MRI Reconstruction [50.248694764703714]
Unrolled neural networks have recently achieved state-of-the-art accelerated MRI reconstruction.
These networks unroll iterative optimization algorithms by alternating between physics-based consistency and neural-network-based regularization.
We propose Greedy LEarning for Accelerated MRI reconstruction, an efficient training strategy for high-dimensional imaging settings.
arXiv Detail & Related papers (2022-07-18T06:01:29Z)
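The entry above describes unrolled networks that alternate physics-based data consistency with learned regularization. A rough sketch of that unrolled structure is given below; GLEAM's greedy, module-wise training and the real MRI forward operator are omitted, and the operator, shapes, and regularizer here are toy assumptions.

```python
# Unrolled reconstruction sketch: each iteration applies a data-consistency
# gradient step for ||Ax - y||^2 followed by a small learned regularizer.
# A is a stand-in random matrix, not an actual MRI sampling operator.
import torch
import torch.nn as nn

class UnrolledRecon(nn.Module):
    def __init__(self, A: torch.Tensor, n_iters: int = 5):
        super().__init__()
        self.A = A                                       # measurement operator
        self.step = nn.Parameter(torch.tensor(0.1))      # learned step size
        self.regs = nn.ModuleList([
            nn.Sequential(nn.Linear(A.shape[1], 64), nn.ReLU(),
                          nn.Linear(64, A.shape[1]))
            for _ in range(n_iters)
        ])

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        x = self.A.t() @ y                               # adjoint initial guess
        for reg in self.regs:
            grad = self.A.t() @ (self.A @ x - y)         # data-consistency gradient
            x = x - self.step * grad                     # physics-based update
            x = x + reg(x)                               # learned residual regularization
        return x

A = torch.randn(32, 64) / 8                              # toy under-sampled operator
y = A @ torch.randn(64)                                  # toy measurements
recon = UnrolledRecon(A)(y)
```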
- Learning Large-scale Subsurface Simulations with a Hybrid Graph Network Simulator [57.57321628587564]
We introduce Hybrid Graph Network Simulator (HGNS) for learning reservoir simulations of 3D subsurface fluid flows.
HGNS consists of a subsurface graph neural network (SGNN) to model the evolution of fluid flows, and a 3D-U-Net to model the evolution of pressure.
Using an industry-standard subsurface flow dataset (SPE-10) with 1.1 million cells, we demonstrate that HGNS is able to reduce the inference time up to 18 times compared to standard subsurface simulators.
arXiv Detail & Related papers (2022-06-15T17:29:57Z)
- Temporal Attention-Augmented Graph Convolutional Network for Efficient Skeleton-Based Human Action Recognition [97.14064057840089]
Graph convolutional networks (GCNs) have been very successful in modeling non-Euclidean data structures.
Most GCN-based action recognition methods use deep feed-forward networks with high computational complexity to process all skeletons in an action.
We propose a temporal attention module (TAM) for increasing the efficiency in skeleton-based action recognition.
arXiv Detail & Related papers (2020-10-23T08:01:55Z)
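A rough sketch of a temporal attention module in the spirit of the entry above: per-frame scores pick out the most informative skeleton frames before the heavier GCN layers run. The scoring network, pooling, and top-k selection are assumptions for illustration, not the paper's exact TAM.

```python
# Temporal attention sketch: score each frame of a skeleton sequence and keep
# only the top-k frames for the expensive downstream processing.
import torch
import torch.nn as nn

class TemporalAttention(nn.Module):
    def __init__(self, feat_dim: int, keep_frames: int):
        super().__init__()
        self.score = nn.Linear(feat_dim, 1)        # per-frame importance score
        self.keep = keep_frames

    def forward(self, x):                          # x: (batch, frames, joints, feat)
        frame_feat = x.mean(dim=2)                 # pool joints -> (batch, frames, feat)
        attn = torch.softmax(self.score(frame_feat).squeeze(-1), dim=1)
        topk = attn.topk(self.keep, dim=1).indices                     # keep top-k frames
        idx = topk.unsqueeze(-1).unsqueeze(-1).expand(-1, -1, x.size(2), x.size(3))
        return torch.gather(x, 1, idx), attn       # reduced sequence + attention weights

x = torch.randn(4, 64, 25, 3)                      # 4 clips, 64 frames, 25 joints, xyz
reduced, weights = TemporalAttention(feat_dim=3, keep_frames=16)(x)
print(reduced.shape)                               # torch.Size([4, 16, 25, 3])
```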
- Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid Flow Prediction [79.81193813215872]
We develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.
We show that we can both generalize well to new situations and benefit from the substantial speedup of neural network CFD predictions.
arXiv Detail & Related papers (2020-07-08T21:23:19Z)
- L$^2$-GCN: Layer-Wise and Learned Efficient Training of Graph Convolutional Networks [118.37805042816784]
Graph convolution networks (GCN) are increasingly popular in many applications, yet remain notoriously hard to train over large graph datasets.
We propose a novel efficient layer-wise training framework for GCN (L-GCN), that disentangles feature aggregation and feature transformation during training.
Experiments show that L-GCN is faster than the state of the art by at least an order of magnitude, with consistent memory usage that does not depend on dataset size.
arXiv Detail & Related papers (2020-03-30T16:37:56Z)
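To illustrate the layer-wise idea of disentangling feature aggregation from feature transformation, here is a rough plain-PyTorch sketch: each layer's neighborhood aggregation is precomputed once and only that layer's transformation is trained before moving on. The per-layer classifier, normalization, and hyperparameters are simplified assumptions, not the exact L$^2$-GCN algorithm.

```python
# Layer-wise GCN training sketch: the aggregation A_hat @ H is computed once
# per layer, so each layer's weights are trained in isolation against the labels.
import torch
import torch.nn as nn

def layerwise_train(A_hat, X, y, dims, epochs=100):
    """A_hat: normalized adjacency (N, N); X: node features (N, d0); y: labels (N,)."""
    H = X
    for d_in, d_out in zip(dims[:-1], dims[1:]):
        agg = A_hat @ H                              # aggregation, fixed for this layer
        layer = nn.Linear(d_in, d_out)               # transformation to be trained
        clf = nn.Linear(d_out, 2)                    # throwaway per-layer classifier (2 classes)
        opt = torch.optim.Adam(list(layer.parameters()) + list(clf.parameters()), lr=1e-2)
        for _ in range(epochs):
            opt.zero_grad()
            nn.functional.cross_entropy(clf(torch.relu(layer(agg))), y).backward()
            opt.step()
        with torch.no_grad():                        # freeze this layer, feed the next
            H = torch.relu(layer(agg))
    return H

N, d0 = 6, 8                                         # toy graph: 6 nodes, 8 features
A = torch.rand(N, N)
A_hat = A / A.sum(dim=1, keepdim=True)               # crude row normalization
X = torch.randn(N, d0)
y = torch.randint(0, 2, (N,))                        # binary node labels
embeddings = layerwise_train(A_hat, X, y, dims=[d0, 16, 16])
```

Training one layer at a time avoids backpropagating through the whole stacked graph computation, which is the source of the efficiency the entry describes.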
This list is automatically generated from the titles and abstracts of the papers in this site.