Multi-branch Spatio-Temporal Graph Neural Network For Efficient Ice Layer Thickness Prediction
- URL: http://arxiv.org/abs/2411.04055v1
- Date: Wed, 06 Nov 2024 16:59:51 GMT
- Title: Multi-branch Spatio-Temporal Graph Neural Network For Efficient Ice Layer Thickness Prediction
- Authors: Zesheng Liu, Maryam Rahnemoonfar
- Abstract summary: We developed a multi-branch spatio-temporal graph neural network that uses the GraphSAGE framework for spatial feature learning and a temporal convolution operation to capture temporal changes.
We found that our proposed multi-branch network consistently outperforms the current fused spatio-temporal graph neural network in both accuracy and efficiency.
- Score: 0.7673339435080445
- Abstract: Understanding spatio-temporal patterns in polar ice layers is essential for tracking changes in ice sheet balance and assessing ice dynamics. While convolutional neural networks are widely used to learn ice layer patterns from raw echogram images captured by airborne snow radar sensors, noise in the echogram images prevents researchers from obtaining high-quality results. Instead, we focus on geometric deep learning using graph neural networks, aiming to build a spatio-temporal graph neural network that learns from thickness information of the top ice layers and predicts the thickness of deeper layers. In this paper, we developed a novel multi-branch spatio-temporal graph neural network that uses the GraphSAGE framework for spatial feature learning and a temporal convolution operation to capture temporal changes, enabling different branches of the network to be more specialized and focus on a single learning task. We found that our proposed multi-branch network consistently outperforms the current fused spatio-temporal graph neural network in both accuracy and efficiency.
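The abstract describes a two-branch design: a GraphSAGE branch for spatial features and a temporal-convolution branch for the thickness sequence. Below is a minimal PyTorch sketch of that idea; the dense adjacency, layer sizes, and concatenation-based fusion head are illustrative assumptions, not the authors' configuration.
```python
import torch
import torch.nn as nn

class SAGELayer(nn.Module):
    """GraphSAGE-style mean aggregation: h_v = relu(W1 x_v + W2 mean_{u in N(v)} x_u)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.self_lin = nn.Linear(in_dim, out_dim)
        self.neigh_lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # adj: [N, N] dense adjacency; row-normalize to get a neighborhood mean
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        return torch.relu(self.self_lin(x) + self.neigh_lin(adj @ x / deg))

class MultiBranchSTGNN(nn.Module):
    def __init__(self, feat_dim, hid_dim, n_layers_out):
        super().__init__()
        self.spatial = SAGELayer(feat_dim, hid_dim)            # spatial branch
        self.temporal = nn.Conv1d(1, hid_dim, kernel_size=3,   # temporal branch
                                  padding=1)
        self.head = nn.Linear(2 * hid_dim, n_layers_out)       # fusion + prediction

    def forward(self, x, adj, thickness_seq):
        # x: [N, feat_dim] node features; thickness_seq: [N, T] known top-layer thicknesses
        h_sp = self.spatial(x, adj)                            # [N, hid]
        h_tp = self.temporal(thickness_seq.unsqueeze(1))       # [N, hid, T]
        h_tp = h_tp.mean(dim=-1)                               # pool over time
        return self.head(torch.cat([h_sp, h_tp], dim=-1))     # [N, n_layers_out]
```
Keeping the two branches separate, as sketched, is what lets each specialize on a single learning task before fusion.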
Related papers
- Peer-to-Peer Learning Dynamics of Wide Neural Networks [10.179711440042123]
We provide an explicit, non-asymptotic characterization of the learning dynamics of wide neural networks trained using popular DGD algorithms.
We validate our analytical results by accurately predicting error dynamics for classification tasks.
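Reading "DGD" as decentralized gradient descent (an assumption), here is a minimal sketch of one DGD update together with a toy consensus problem; the mixing matrix and step size are illustrative.
```python
import torch

def dgd_step(theta, local_grads, W, lr=0.1):
    # theta: [n_agents, dim] each agent's parameters
    # local_grads: [n_agents, dim] gradient of each agent's local loss
    # W: [n_agents, n_agents] doubly-stochastic mixing (gossip) matrix
    return W @ theta - lr * local_grads

# Toy quadratic example: agent i minimizes ||theta - t_i||^2 / 2.
targets = torch.tensor([[0.0], [1.0], [2.0]])
W = torch.full((3, 3), 1 / 3)          # complete-graph averaging
theta = torch.zeros(3, 1)
for _ in range(100):
    theta = dgd_step(theta, theta - targets, W)
# All agents converge near the mean of the targets (1.0), up to a
# constant-stepsize bias.
```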
arXiv Detail & Related papers (2024-09-23T17:57:58Z)
- Learning Spatio-Temporal Patterns of Polar Ice Layers With Physics-Informed Graph Neural Network [0.7673339435080445]
We propose a physics-informed hybrid graph neural network that combines the GraphSAGE framework for graph feature learning with the long short-term memory (LSTM) structure for learning temporal changes.
We found that our network can consistently outperform the current non-inductive or non-physical model in predicting deep ice layer thickness.
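A minimal sketch of the hybrid described above: GraphSAGE-style aggregation for graph features plus an LSTM over the known top-layer thickness sequence. Shapes, sizes, and the output head are assumptions, and the physics-informed loss term the paper adds is omitted.
```python
import torch
import torch.nn as nn

class SAGELSTMHybrid(nn.Module):
    def __init__(self, feat_dim, hid_dim, n_layers_out):
        super().__init__()
        self.self_lin = nn.Linear(feat_dim, hid_dim)
        self.neigh_lin = nn.Linear(feat_dim, hid_dim)
        self.lstm = nn.LSTM(input_size=1, hidden_size=hid_dim, batch_first=True)
        self.head = nn.Linear(2 * hid_dim, n_layers_out)

    def forward(self, x, adj, thickness_seq):
        # x: [N, feat_dim]; adj: [N, N]; thickness_seq: [N, T]
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        h_sp = torch.relu(self.self_lin(x) + self.neigh_lin(adj @ x / deg))
        _, (h_n, _) = self.lstm(thickness_seq.unsqueeze(-1))  # last hidden state
        return self.head(torch.cat([h_sp, h_n[-1]], dim=-1))
```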
arXiv Detail & Related papers (2024-06-21T16:41:02Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
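As a concrete illustration of "neural networks as computational graphs of parameters", here is a hedged sketch that turns an MLP's weight matrices into an edge list with weights as edge features and biases as node features; the paper's actual encoding is richer, and the layout here is an assumption.
```python
import torch

def mlp_to_graph(weight_mats, biases):
    """weight_mats: list of [out_i, in_i] tensors; biases: list of [out_i] tensors."""
    sizes = [weight_mats[0].shape[1]] + [w.shape[0] for w in weight_mats]
    offsets = torch.tensor([0] + sizes).cumsum(0)   # start index of each neuron layer
    edges, edge_feats = [], []
    for l, w in enumerate(weight_mats):
        out_dim, in_dim = w.shape
        src = offsets[l] + torch.arange(in_dim).repeat(out_dim)
        dst = offsets[l + 1] + torch.arange(out_dim).repeat_interleave(in_dim)
        edges.append(torch.stack([src, dst]))
        edge_feats.append(w.reshape(-1))            # weight value on each edge
    node_feats = torch.zeros(int(offsets[-1]))
    for l, b in enumerate(biases):
        node_feats[offsets[l + 1]:offsets[l + 2]] = b   # bias as node feature
    return torch.cat(edges, dim=1), torch.cat(edge_feats), node_feats
```
Because the graph encodes only connectivity and parameter values, one GNN can consume MLPs of different widths and depths.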
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
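To make "local distribution of node features" concrete, here is a hedged sketch: per-node neighborhood histograms scored with the histogram-intersection kernel k(h1, h2) = Σ_i min(h1_i, h2_i). The binning scheme and learnable reference histograms are assumptions about the details.
```python
import torch
import torch.nn.functional as F

def neighborhood_histograms(x, adj, n_bins=16, lo=0.0, hi=1.0):
    # x: [N] scalar node features (for simplicity); adj: [N, N] adjacency with self-loops
    boundaries = torch.linspace(lo, hi, n_bins + 1)[1:-1]
    bins = torch.bucketize(x.clamp(lo, hi), boundaries)      # [N] bin indices
    onehot = F.one_hot(bins, n_bins).float()                 # [N, n_bins]
    hist = adj @ onehot                                      # neighborhood bin counts
    return hist / hist.sum(dim=1, keepdim=True).clamp(min=1e-9)

def histogram_intersection(h, refs):
    # h: [N, n_bins]; refs: [K, n_bins] -> [N, K] similarity scores
    return torch.minimum(h.unsqueeze(1), refs.unsqueeze(0)).sum(dim=-1)
```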
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Addressing caveats of neural persistence with deep graph persistence [54.424983583720675]
We find that the variance of network weights and spatial concentration of large weights are the main factors that impact neural persistence.
We propose an extension of the filtration underlying neural persistence to the whole neural network instead of single layers.
This yields our deep graph persistence measure, which implicitly incorporates persistent paths through the network and alleviates variance-related issues.
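The underlying computation is zero-dimensional persistence of a graph filtered by descending absolute weight; a minimal union-find sketch of that generic step follows (the paper's whole-network filtration is not reproduced).
```python
def zero_dim_persistence(edges):
    """edges: list of (abs_weight, u, v); returns the weights at which components merge."""
    parent = {}
    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a
    deaths = []
    for w, u, v in sorted(edges, reverse=True):   # descending-weight filtration
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            deaths.append(w)   # a connected component dies at weight w
        # edges within one component create no new 0-dim feature
    return deaths
```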
arXiv Detail & Related papers (2023-07-20T13:34:11Z)
- Efficient-Dyn: Dynamic Graph Representation Learning via Event-based Temporal Sparse Attention Network [2.0047096160313456]
Dynamic graph neural networks have received increasing attention from researchers.
We propose a novel dynamic graph neural network, Efficient-Dyn.
It adaptively encodes temporal information into a sequence of patches with an equal amount of temporal-topological structure.
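As a rough illustration of the patch encoding (the details here are a guess from the summary): slice the time-sorted event stream so that each patch holds the same number of interaction events, giving every patch a comparable amount of temporal-topological structure. The attention network itself is omitted.
```python
def events_to_patches(events, events_per_patch):
    """events: list of (timestamp, src, dst) tuples, assumed time-sorted."""
    return [events[i:i + events_per_patch]
            for i in range(0, len(events), events_per_patch)]
```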
arXiv Detail & Related papers (2022-01-04T23:52:24Z)
- Learning through structure: towards deep neuromorphic knowledge graph embeddings [0.5906031288935515]
We propose a strategy to map deep graph learning architectures for knowledge graph reasoning to neuromorphic architectures.
Based on the insight that randomly initialized and untrained graph neural networks are able to preserve local graph structures, we compose a frozen graph neural network with shallow knowledge graph embedding models.
We experimentally show that already on conventional computing hardware, this leads to a significant speedup and memory reduction while maintaining a competitive performance level.
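A minimal sketch of the composition, assuming a one-layer random message-passing encoder and a DistMult-style scorer; the paper's exact components may differ.
```python
import torch

torch.manual_seed(0)
N, R, D = 100, 5, 32                       # entities, relations, embedding dim
ent = torch.nn.Embedding(N, D)             # shallow embeddings (trainable)
rel = torch.nn.Embedding(R, D)
W_frozen = torch.randn(D, D) / D ** 0.5    # random, frozen, never trained

def encode(adj):
    # one round of random message passing; only ent/rel receive gradients
    deg = adj.sum(1, keepdim=True).clamp(min=1)
    return torch.relu((adj @ ent.weight / deg) @ W_frozen) + ent.weight

def score(h, head, r, tail):
    # DistMult scoring on the structure-enriched embeddings
    return (h[head] * rel.weight[r] * h[tail]).sum(-1)

adj = (torch.rand(N, N) < 0.05).float()    # toy adjacency for demonstration
print(score(encode(adj), torch.tensor([0]), torch.tensor([1]), torch.tensor([2])))
```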
arXiv Detail & Related papers (2021-09-21T18:01:04Z)
- Spatio-Temporal Inception Graph Convolutional Networks for Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
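A hedged sketch of such a building block, with a spatial graph path and two temporal-convolution paths of different kernel sizes concatenated inception-style; channel sizes and path choices are illustrative, not the paper's configuration.
```python
import torch
import torch.nn as nn

class STInceptionBlock(nn.Module):
    def __init__(self, c_in, c_path):
        super().__init__()
        self.spatial = nn.Linear(c_in, c_path)                      # applied after A @ x
        self.t3 = nn.Conv2d(c_in, c_path, (3, 1), padding=(1, 0))   # short temporal window
        self.t5 = nn.Conv2d(c_in, c_path, (5, 1), padding=(2, 0))   # longer temporal window

    def forward(self, x, adj):
        # x: [B, C, T, V] (batch, channels, frames, joints); adj: [V, V]
        h_sp = self.spatial((x @ adj.t()).permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
        return torch.cat([h_sp, self.t3(x), self.t5(x)], dim=1)    # [B, 3*c_path, T, V]
```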
arXiv Detail & Related papers (2020-11-26T14:43:04Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
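DAGNN decouples feature transformation from propagation and learns per-hop gates; a minimal sketch, with hidden size and depth K as illustrative choices:
```python
import torch
import torch.nn as nn

class DAGNN(nn.Module):
    def __init__(self, in_dim, n_classes, K=10):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, n_classes))
        self.gate = nn.Linear(n_classes, 1)
        self.K = K

    def forward(self, x, adj_norm):
        z = self.mlp(x)                      # transformation, applied once
        hops = [z]
        for _ in range(self.K):              # parameter-free propagation
            hops.append(adj_norm @ hops[-1])
        H = torch.stack(hops, dim=1)         # [N, K+1, n_classes]
        s = torch.sigmoid(self.gate(H))      # [N, K+1, 1] adaptive hop scores
        return (s * H).sum(dim=1)            # gated combination over receptive fields
```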
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
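The "sweet spot" is located in the plane of two graph measures, average clustering coefficient and average shortest path length; a small sketch computing both with networkx, using a Watts-Strogatz graph as an illustrative stand-in for the paper's relational graphs:
```python
import networkx as nx

# connected variant avoids failures of average_shortest_path_length
g = nx.connected_watts_strogatz_graph(n=64, k=8, p=0.1, seed=0)
print(nx.average_clustering(g), nx.average_shortest_path_length(g))
```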
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
- Hcore-Init: Neural Network Initialization based on Graph Degeneracy [22.923756039561194]
We propose an adapted version of the k-core structure for the complete weighted multipartite graph extracted from a deep learning architecture.
As a multipartite graph is a combination of bipartite graphs, which are in turn the incidence graphs of hypergraphs, we design a k-hypercore decomposition.
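For reference, a minimal sketch of plain k-core peeling, the structure the paper adapts to weighted multipartite graphs; the k-hypercore extension itself is not shown.
```python
def core_numbers(adj):
    """adj: dict node -> set of neighbours; returns dict node -> core number."""
    adj = {u: set(vs) for u, vs in adj.items()}
    deg = {u: len(vs) for u, vs in adj.items()}
    core, k = {}, 0
    while deg:
        k = max(k, min(deg.values()))
        # peel every node whose remaining degree is at most k
        for u in [u for u, d in deg.items() if d <= k]:
            core[u] = k
            for v in adj[u]:
                if v in deg and v != u:
                    deg[v] -= 1
                    adj[v].discard(u)
            del deg[u], adj[u]
    return core
```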
arXiv Detail & Related papers (2020-04-16T12:57:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.