Graph Neural Networks for Emulation of Finite-Element Ice Dynamics in Greenland and Antarctic Ice Sheets
- URL: http://arxiv.org/abs/2406.18423v1
- Date: Wed, 26 Jun 2024 15:18:49 GMT
- Title: Graph Neural Networks for Emulation of Finite-Element Ice Dynamics in Greenland and Antarctic Ice Sheets
- Authors: Younghyun Koo, Maryam Rahnemoonfar
- Abstract summary: An equivariant graph convolutional network (EGCN) is used as an emulator for ice sheet dynamics modeling.
EGCN reproduces ice thickness and velocity changes in the Helheim Glacier, Greenland, and Pine Island Glacier, Antarctica, with 260 times and 44 times faster computation time, respectively.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Although numerical models provide accurate solutions for ice sheet dynamics based on physics laws, they accompany intensified computational demands to solve partial differential equations. In recent years, convolutional neural networks (CNNs) have been widely used as statistical emulators for those numerical models. However, since CNNs operate on regular grids, they cannot represent the refined meshes and computational efficiency of finite-element numerical models. Therefore, instead of CNNs, this study adopts an equivariant graph convolutional network (EGCN) as an emulator for the ice sheet dynamics modeling. EGCN reproduces ice thickness and velocity changes in the Helheim Glacier, Greenland, and Pine Island Glacier, Antarctica, with 260 times and 44 times faster computation time, respectively. Compared to the traditional CNN and graph convolutional network, EGCN shows outstanding accuracy in thickness prediction near fast ice streams by preserving the equivariance to the translation and rotation of graphs.
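The property the abstract emphasizes, equivariance to translation and rotation of the graph, can be illustrated with a minimal EGNN-style message-passing layer. This is a sketch with random, untrained weights standing in for the learned MLPs, not the authors' implementation: squared distances enter the messages as invariants, and coordinate updates are built from relative positions, so transforming the input mesh transforms the output the same way.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fixed-seed weights standing in for the learned MLPs phi_e, phi_x, phi_h.
W_e = rng.normal(size=(2 * 4 + 1, 8))   # edge MLP: [h_i, h_j, dist^2] -> message
W_x = rng.normal(size=(8, 1))           # per-edge scalar for the coordinate update
W_h = rng.normal(size=(4 + 8, 4))       # node MLP: [h_i, summed messages] -> h_i'

def egcn_layer(h, x, edges):
    """One EGNN-style layer: node features h (N, 4), coordinates x (N, 3).

    Distances are invariant inputs; coordinate updates use relative
    positions, which makes the layer E(3)-equivariant (illustrative only).
    """
    msg_sum = np.zeros((h.shape[0], 8))
    dx = np.zeros_like(x)
    for i, j in edges:
        d2 = np.sum((x[i] - x[j]) ** 2)                       # rotation/translation invariant
        m = np.tanh(np.concatenate([h[i], h[j], [d2]]) @ W_e)  # message on edge (i, j)
        msg_sum[i] += m
        dx[i] += (x[i] - x[j]) * (m @ W_x)                    # equivariant coordinate update
    h_new = np.tanh(np.concatenate([h, msg_sum], axis=1) @ W_h)
    return h_new, x + dx
```

Rotating and translating the input coordinates before the layer gives the same result as applying the layer first and transforming its output, while the node features are unchanged, which is the equivariance the abstract credits for the accuracy gain near fast ice streams.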
Related papers
- Graph Neural Network as Computationally Efficient Emulator of Ice-sheet and Sea-level System Model (ISSM) [0.0]
We design a graph convolutional network (GCN) as a fast emulator for the Ice-sheet and Sea-level System Model (ISSM).
GCN shows 34 times faster computational speed than the CPU-based ISSM modeling.
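As a sketch of the propagation rule such a GCN emulator builds on (the standard Kipf-Welling layer; not ISSM-specific code, and the variable roles are assumptions for illustration):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One standard GCN layer: relu(D^-1/2 (A + I) D^-1/2 H W).

    A: (N, N) adjacency, H: (N, F) node features, W: (F, F') weights.
    Purely illustrative of the propagation rule, not the emulator itself.
    """
    A_hat = A + np.eye(A.shape[0])      # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(d ** -0.5)     # symmetric degree normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)
```

On an ice-sheet mesh, A would come from the finite-element connectivity and H from nodal inputs such as thickness or basal friction, which is how the graph formulation preserves the refined mesh structure that regular-grid CNNs cannot.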
arXiv Detail & Related papers (2024-06-26T16:13:11Z)
- Uncertainty-enabled machine learning for emulation of regional sea-level change caused by the Antarctic Ice Sheet [0.8130739369606821]
We build neural-network emulators of sea-level change at 27 coastal locations.
We show that the neural-network emulators have an accuracy that is competitive with baseline machine learning emulators.
arXiv Detail & Related papers (2024-06-21T18:27:09Z)
- Learning Spatio-Temporal Patterns of Polar Ice Layers With Physics-Informed Graph Neural Network [0.7673339435080445]
We propose a physics-informed hybrid graph neural network that combines the GraphSAGE framework for graph feature learning with the long short-term memory (LSTM) structure for learning temporal changes.
We found that our network can consistently outperform the current non-inductive or non-physical model in predicting deep ice layer thickness.
arXiv Detail & Related papers (2024-06-21T16:41:02Z)
- CNN2GNN: How to Bridge CNN with GNN [59.42117676779735]
We propose a novel CNN2GNN framework to unify CNN and GNN together via distillation.
The performance of the distilled "boosted" two-layer GNN on Mini-ImageNet is much higher than that of CNNs containing dozens of layers, such as ResNet152.
arXiv Detail & Related papers (2024-04-23T08:19:08Z)
- Graph Neural Networks as Fast and High-fidelity Emulators for Finite-Element Ice Sheet Modeling [0.0]
We develop graph neural networks (GNNs) as fast surrogate models to preserve the finite element structure of the Ice-sheet and Sea-level System Model (ISSM)
GNNs reproduce ice thickness and velocity with better accuracy than the classic convolutional neural network (CNN) and multi-layer perceptron (MLP).
arXiv Detail & Related papers (2024-02-07T22:10:36Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error (RMSE) results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine spatio-temporal resolutions to resolve all turbulent scales in the fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
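The "spatially continuous solution given a position query" idea can be sketched as a coordinate-based read-out: a small network maps any continuous (x, y) query to a field value, with no fixed grid. The weights below are random toy values, purely illustrative of the query interface, not MAgNet's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(2, 16))   # toy weights: 2-D position -> hidden features
b1 = np.zeros(16)
W2 = rng.normal(size=(16, 1))   # hidden features -> predicted field value

def query_solution(pos):
    """Mesh-agnostic read-out: evaluate the predicted PDE field at an
    arbitrary continuous spatial query (x, y). Toy stand-in only."""
    z = np.tanh(np.asarray(pos, dtype=float) @ W1 + b1)
    return (z @ W2).item()
```

Because the query can be any point, the same trained model can be evaluated on meshes or resolutions it never saw during training, which is what makes the solver mesh-agnostic.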
arXiv Detail & Related papers (2022-10-11T14:52:20Z)
- Deep convolutional surrogates and degrees of freedom in thermal design [0.0]
Convolutional Neural Networks (CNNs) are used to predict results of Computational Fluid Dynamics (CFD) directly from topologies saved as images.
We present surrogate models for heat transfer and pressure drop prediction of complex fin geometries generated using composite Bezier curves.
arXiv Detail & Related papers (2022-08-16T00:45:39Z)
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs Bernoulli-sampled from the graphon.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
- Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid Flow Prediction [79.81193813215872]
We develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.
We show that we can both generalize well to new situations and benefit from the substantial speedup of neural network CFD predictions.
arXiv Detail & Related papers (2020-07-08T21:23:19Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
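A minimal sketch of such a cell is one explicit-Euler step of dx/dt = -x/tau + f(x, I)(A - x), where f is a sigmoid gate over the state x and input I. This is an illustration of the general form, with assumed shapes and a toy gate, not the paper's exact parameterization.

```python
import numpy as np

def ltc_step(x, I, dt, tau, A, W, b):
    """One explicit-Euler step of a liquid time-constant cell:
    dx/dt = -x/tau + f(x, I) * (A - x), f a sigmoid gate (illustrative).

    The gate f both drives the state toward A and shortens the effective
    time constant, so the dynamics stay stable and bounded.
    """
    f = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, I]) + b)))  # gate in (0, 1)
    dxdt = -x / tau + f * (A - x)
    return x + dt * dxdt
```

With tau > 0 and a small step size, the state is squeezed between 0 and A regardless of the input sequence, which is the stable, bounded behavior the abstract refers to.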
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.