GRIT: Graph Transformer For Internal Ice Layer Thickness Prediction
- URL: http://arxiv.org/abs/2507.07388v1
- Date: Thu, 10 Jul 2025 02:59:21 GMT
- Title: GRIT: Graph Transformer For Internal Ice Layer Thickness Prediction
- Authors: Zesheng Liu, Maryam Rahnemoonfar
- Abstract summary: Radar sensors, capable of penetrating ice, capture detailed radargram images of internal ice layers. GRIT integrates an inductive geometric graph learning framework with an attention mechanism to map the relationships between shallow and deeper ice layers.
- Score: 0.7673339435080445
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaining a deeper understanding of the thickness and variability of internal ice layers in radar imagery is essential for monitoring snow accumulation, better evaluating ice dynamics processes, and minimizing uncertainties in climate models. Radar sensors, capable of penetrating ice, capture detailed radargram images of internal ice layers. In this work, we introduce GRIT, a graph transformer for ice layer thickness prediction. GRIT integrates an inductive geometric graph learning framework with an attention mechanism designed to map the relationships between shallow and deeper ice layers. Compared to baseline graph neural networks, GRIT demonstrates consistently lower prediction errors. These results highlight the attention mechanism's effectiveness in capturing temporal changes across ice layers, while the graph transformer combines the strengths of transformers for learning long-range dependencies with graph neural networks for capturing spatial patterns, enabling robust modeling of complex spatiotemporal dynamics.
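The abstract describes combining graph structure with attention: each node (e.g. a location along a shallow ice layer) attends to its graph neighbors when predicting deeper-layer thickness. As a rough, hypothetical illustration only (this is not the authors' implementation; all function names, shapes, and the masking scheme are assumptions), a single masked self-attention step over graph node features might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention_layer(H, A, Wq, Wk, Wv):
    """One masked self-attention step over graph node features.

    H  : (N, d) node features (e.g. per-location features of a shallow layer)
    A  : (N, N) adjacency matrix; attention is restricted to connected nodes
    Wq, Wk, Wv : projection matrices (learned in practice; given arrays here)
    """
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = (Q @ K.T) / np.sqrt(K.shape[1])     # scaled dot-product scores
    scores = np.where(A > 0, scores, -1e9)       # mask out non-neighbors
    return softmax(scores, axis=-1) @ V          # attention-weighted values

# Toy example: 4 nodes in a chain graph, 3-dim features.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
A = np.eye(4) + np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
Wq, Wk, Wv = (rng.normal(size=(3, 3)) for _ in range(3))
out = graph_attention_layer(H, A, Wq, Wk, Wv)
print(out.shape)  # (4, 3)
```

Restricting attention to graph neighbors is what distinguishes this from a plain transformer layer: the adjacency mask injects the spatial structure that the abstract attributes to the graph-learning side of GRIT.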
Related papers
- ST-GRIT: Spatio-Temporal Graph Transformer For Internal Ice Layer Thickness Prediction [0.7673339435080445]
Understanding the thickness and variability of internal ice layers in radar imagery is crucial for monitoring snow accumulation, assessing ice dynamics, and reducing uncertainties in climate models. In this work, we present ST-GRIT, a spatio-temporal graph transformer for ice layer thickness, designed to process radargrams and capture relationships between shallow and deep ice layers. ST-GRIT consistently outperforms current state-of-the-art methods and other baseline graph neural networks by achieving lower root mean squared error.
arXiv Detail & Related papers (2025-07-10T03:06:01Z) - AI-ready Snow Radar Echogram Dataset (SRED) for climate change monitoring [0.32985979395737786]
This study introduces the first comprehensive radar echogram dataset derived from Snow Radar airborne data collected in 2012. To demonstrate its utility, we evaluated the performance of five deep learning models on the dataset.
arXiv Detail & Related papers (2025-05-01T18:29:36Z) - Point Cloud Denoising With Fine-Granularity Dynamic Graph Convolutional Networks [58.050130177241186]
Noise perturbations often corrupt 3-D point clouds, hindering downstream tasks such as surface reconstruction, rendering, and further processing.
This paper introduces fine-granularity dynamic graph convolutional networks, called GDGCN, a novel approach to denoising 3-D point clouds.
arXiv Detail & Related papers (2024-11-21T14:19:32Z) - Multi-branch Spatio-Temporal Graph Neural Network For Efficient Ice Layer Thickness Prediction [0.7673339435080445]
We developed a multi-branch spatio-temporal graph neural network that uses the GraphSAGE framework for graph feature learning and a temporal convolution operation to capture temporal changes.
We found that our proposed multi-branch network consistently outperforms the current fused spatio-temporal graph neural network in both accuracy and efficiency.
arXiv Detail & Related papers (2024-11-06T16:59:51Z) - SGFormer: Single-Layer Graph Transformers with Approximation-Free Linear Complexity [74.51827323742506]
We evaluate the necessity of adopting multi-layer attentions in Transformers on graphs.
We show that multi-layer propagation can be reduced to one-layer propagation with the same capability for representation learning.
It suggests a new technical path for building powerful and efficient Transformers on graphs.
arXiv Detail & Related papers (2024-09-13T17:37:34Z) - Learning Spatio-Temporal Patterns of Polar Ice Layers With Physics-Informed Graph Neural Network [0.7673339435080445]
We propose a physics-informed hybrid graph neural network that combines the GraphSAGE framework for graph feature learning with the long short-term memory (LSTM) structure for learning temporal changes.
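Several entries above build on the GraphSAGE framework for inductive graph feature learning. A minimal sketch of its mean-aggregation update (the weight names, shapes, and ReLU choice here are illustrative assumptions, not the papers' actual code):

```python
import numpy as np

def graphsage_mean_layer(H, neighbors, W_self, W_neigh):
    """One GraphSAGE layer with mean aggregation. The weights are shared
    across nodes, which is what makes the scheme inductive: the trained
    layer can be applied to graphs and nodes unseen during training.

    H         : (N, d) node features
    neighbors : list of neighbor-index lists, one per node
    W_self, W_neigh : (d, d_out) weight matrices (learned in practice)
    """
    out = np.zeros((H.shape[0], W_self.shape[1]))
    for i, nbrs in enumerate(neighbors):
        # Mean of neighbor features; zero vector for isolated nodes.
        agg = H[nbrs].mean(axis=0) if nbrs else np.zeros(H.shape[1])
        # Combine self and aggregated neighbor information, then ReLU.
        out[i] = np.maximum(H[i] @ W_self + agg @ W_neigh, 0.0)
    return out

# Toy chain graph with 4 nodes and 3-dim input features.
rng = np.random.default_rng(1)
H = rng.normal(size=(4, 3))
neighbors = [[1], [0, 2], [1, 3], [2]]
out = graphsage_mean_layer(H, neighbors,
                           rng.normal(size=(3, 2)), rng.normal(size=(3, 2)))
print(out.shape)  # (4, 2)
```

In the hybrid model described above, per-layer node embeddings produced this way would then feed a temporal module (an LSTM) to capture changes from shallow to deep layers.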
We found that our network consistently outperforms current non-inductive or non-physics-informed models in predicting deep ice layer thickness.
arXiv Detail & Related papers (2024-06-21T16:41:02Z) - RobGC: Towards Robust Graph Condensation [61.259453496191696]
Graph neural networks (GNNs) have attracted widespread attention for their impressive capability of graph representation learning. However, the increasing prevalence of large-scale graphs presents a significant challenge for GNN training due to their computational demands. We propose graph condensation (GC) to generate an informative compact graph that enables efficient training of GNNs while retaining performance.
arXiv Detail & Related papers (2024-06-19T04:14:57Z) - Dirichlet Energy Enhancement of Graph Neural Networks by Framelet Augmentation [19.56268823452656]
We introduce a framelet system into the analysis of Dirichlet energy and take a multi-scale perspective to leverage the Dirichlet energy.
Based on that, we design the Energy Enhanced Convolution (EEConv), which is an effective and practical operation.
Experiments show that deep GNNs with EEConv achieve state-of-the-art performance over various node classification datasets.
arXiv Detail & Related papers (2023-11-09T22:22:18Z) - Gradient Gating for Deep Multi-Rate Learning on Graphs [62.25886489571097]
We present Gradient Gating (G$^2$), a novel framework for improving the performance of graph neural networks (GNNs).
Our framework is based on gating the output of GNN layers with a mechanism for multi-rate flow of message passing information across nodes of the underlying graph.
arXiv Detail & Related papers (2022-10-02T13:19:48Z) - Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z) - Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low pass characteristics in balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences arising from its use.