The World as a Graph: Improving El Niño Forecasts with Graph Neural
Networks
- URL: http://arxiv.org/abs/2104.05089v1
- Date: Sun, 11 Apr 2021 19:55:55 GMT
- Title: The World as a Graph: Improving El Niño Forecasts with Graph Neural
Networks
- Authors: Salva Rühling Cachay, Emma Erickson, Arthur Fender C. Bucker, Ernest
Pokropek, Willa Potosnak, Suyash Bire, Salomey Osei, Björn Lütjens
- Abstract summary: We propose the first application of graph neural networks to seasonal forecasting.
Our model, Graphino, outperforms state-of-the-art deep learning-based models for forecasts up to six months ahead.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep learning-based models have recently outperformed state-of-the-art
seasonal forecasting models, such as for predicting the El Niño-Southern
Oscillation (ENSO). However, current deep learning models are based on
convolutional neural networks which are difficult to interpret and can fail to
model large-scale atmospheric patterns. In comparison, graph neural networks
(GNNs) are capable of modeling large-scale spatial dependencies and are more
interpretable due to the explicit modeling of information flow through edge
connections. We propose the first application of graph neural networks to
seasonal forecasting. We design a novel graph connectivity learning module that
enables our GNN model to learn large-scale spatial interactions jointly with
the actual ENSO forecasting task. Our model, Graphino, outperforms
state-of-the-art deep learning-based models for forecasts up to six months
ahead. Additionally, we show that our model is more interpretable as it learns
sensible connectivity structures that correlate with the ENSO anomaly pattern.
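The connectivity-learning idea can be sketched as follows. This is an illustrative toy, not the paper's implementation; the array shapes, the sigmoid edge normalisation, and the linear readout are all assumptions. The point is that the adjacency is a trainable score matrix whose soft edges can be optimised jointly with the forecasting loss:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: N grid nodes, each with F input features (e.g. sea-surface
# temperature and heat-content anomalies per cell), hidden size H.
N, F, H = 6, 2, 4

# Instead of a fixed adjacency, keep a trainable score matrix S and
# derive a soft, row-normalised adjacency from it, so the edge
# structure is learned together with the forecast.
S = rng.normal(size=(N, N))               # trainable edge scores
W = 0.1 * rng.normal(size=(F, H))         # trainable feature projection
w_out = 0.1 * rng.normal(size=(N * H,))   # readout to a scalar forecast

def soft_adjacency(S):
    """Sigmoid edge weights, normalised so each row sums to 1."""
    A = 1.0 / (1.0 + np.exp(-S))
    return A / A.sum(axis=1, keepdims=True)

def forward(X):
    """One round of message passing over the learned edges, then a
    linear readout to a scalar index forecast (e.g. an ONI-style value)."""
    A = soft_adjacency(S)
    h = np.tanh(A @ X @ W)
    return float(w_out @ h.ravel())

X = rng.normal(size=(N, F))
prediction = forward(X)
```

Because gradients flow through `soft_adjacency` as well as `W`, a single forecasting loss would update the edge scores and the node weights together, which is what "jointly with the actual ENSO forecasting task" refers to.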
Related papers
- Label Deconvolution for Node Representation Learning on Large-scale
Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z) - Graph-enabled Reinforcement Learning for Time Series Forecasting with
Adaptive Intelligence [11.249626785206003]
We propose a novel approach for predicting time-series data using a graph neural network (GNN) with reinforcement learning (RL)-based monitoring.
GNNs are able to explicitly incorporate the graph structure of the data into the model, allowing them to capture temporal dependencies in a more natural way.
This approach allows for more accurate predictions in complex temporal structures, such as those found in healthcare, traffic and weather forecasting.
arXiv Detail & Related papers (2023-09-18T22:25:12Z) - LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation [51.552170474958736]
We propose to capture long-distance dependency in graphs by shallower models instead of deeper models, which leads to a much more efficient model, LazyGNN, for graph representation learning.
LazyGNN is compatible with existing scalable approaches (such as sampling methods) for further accelerations through the development of mini-batch LazyGNN.
Comprehensive experiments demonstrate its superior prediction performance and scalability on large-scale benchmarks.
arXiv Detail & Related papers (2023-02-03T02:33:07Z) - Data-Free Adversarial Knowledge Distillation for Graph Neural Networks [62.71646916191515]
We propose the first end-to-end framework for data-free adversarial knowledge distillation on graph-structured data (DFAD-GNN).
Specifically, DFAD-GNN employs a generative adversarial framework with three components: a pre-trained teacher model and a student model act as two discriminators, while a generator produces the training graphs used to distill knowledge from the teacher into the student.
Our DFAD-GNN significantly surpasses state-of-the-art data-free baselines in the graph classification task.
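A minimal numeric sketch of that three-player loop follows. All models here are hypothetical linear stand-ins, not the paper's architectures: the student is pulled toward the teacher's outputs on generated inputs, while the generator is nudged toward inputs where the two still disagree, so no real training data is ever needed:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins: teacher and student map F node features to
# C class logits; the "generator" maps latents to synthetic features.
F, C, Z = 4, 3, 8
teacher = rng.normal(size=(F, C))   # frozen, pre-trained teacher
student = np.zeros((F, C))          # student to be distilled
gen = rng.normal(size=(Z, F))       # generator: latent -> features

history = []
for step in range(200):
    z = rng.normal(size=(16, Z))
    X = np.tanh(z @ gen)                  # generated "graph" features
    diff = X @ student - X @ teacher      # teacher/student disagreement
    history.append(np.abs(diff).mean())
    # Student step: gradient descent on the squared disagreement
    # (the distillation objective).
    student -= 0.1 * X.T @ diff / len(X)
    # Generator step: gradient *ascent* on the same objective, pushing
    # toward inputs where the student fails to match the teacher.
    grad_X = diff @ (student - teacher).T
    gen += 0.01 * z.T @ (grad_X * (1.0 - X**2))
```

The adversarial generator keeps probing for disagreement, but the student's faster updates still shrink the teacher-student gap over the run.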
arXiv Detail & Related papers (2022-05-08T08:19:40Z) - EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
To overcome the difficulty finite-depth GNNs have in capturing long-range dependencies, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-02-22T08:16:58Z) - Significant Wave Height Prediction based on Wavelet Graph Neural Network [2.8383948890824913]
"Soft computing" approaches, including machine learning and deep learning models, have achieved considerable success in recent years.
A Wavelet Graph Neural Network (WGNN) approach is proposed to integrate the advantages of wavelet transform and graph neural network.
Experimental results show that the proposed WGNN approach outperforms other models, including the numerical models, the machine learning models, and several deep learning models.
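The wavelet-plus-GNN combination can be illustrated with a one-level Haar wavelet and a hand-written three-buoy graph (both assumptions for illustration; the paper's wavelet family, graph, and network are richer than this). Each frequency band is processed separately on the graph before the signal is reassembled:

```python
import numpy as np

rng = np.random.default_rng(2)
N, T = 3, 8                        # buoys, time steps (T even)
X = rng.normal(size=(N, T))        # toy wave-height series per buoy

# One-level Haar transform along time: low-frequency approximation
# coefficients and high-frequency detail coefficients.
approx = (X[:, 0::2] + X[:, 1::2]) / np.sqrt(2)
detail = (X[:, 0::2] - X[:, 1::2]) / np.sqrt(2)

# Row-normalised adjacency over the buoys (assumed fully connected).
A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
A = A / A.sum(axis=1, keepdims=True)

# One graph-convolution (neighbour-averaging) step per frequency band,
# so spatial structure is exploited separately at each scale.
approx_s = 0.5 * approx + 0.5 * (A @ approx)
detail_s = 0.5 * detail + 0.5 * (A @ detail)

# Inverse Haar transform: reassemble a signal smoothed in both the
# wavelet domain and the graph domain.
rec = np.empty_like(X)
rec[:, 0::2] = (approx_s + detail_s) / np.sqrt(2)
rec[:, 1::2] = (approx_s - detail_s) / np.sqrt(2)
```

Because the Haar transform is exactly invertible, any change in the reconstruction comes purely from the per-band graph step, which is the design point of separating scales before mixing spatially.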
arXiv Detail & Related papers (2021-07-20T13:34:48Z) - Sparse Flows: Pruning Continuous-depth Models [107.98191032466544]
We show that pruning improves generalization for neural ODEs in generative modeling.
We also show that pruning finds minimal and efficient neural ODE representations with up to 98% less parameters compared to the original network, without loss of accuracy.
arXiv Detail & Related papers (2021-06-24T01:40:17Z) - A Deep Latent Space Model for Graph Representation Learning [10.914558012458425]
We propose a Deep Latent Space Model (DLSM) for directed graphs to incorporate the traditional latent variable based generative model into deep learning frameworks.
Our proposed model consists of a graph convolutional network (GCN) encoder and a decoder, which are layer-wise connected by a hierarchical variational auto-encoder architecture.
Experiments on real-world datasets show that the proposed model achieves the state-of-the-art performances on both link prediction and community detection tasks.
arXiv Detail & Related papers (2021-06-22T12:41:19Z) - Graph Neural Networks for Improved El Niño Forecasting [0.009620910657090186]
We propose an application of graph neural networks (GNNs) to forecast the El Niño-Southern Oscillation (ENSO) at long lead times.
Preliminary results are promising and outperform state-of-the-art systems for projections 1 and 3 months ahead.
arXiv Detail & Related papers (2020-12-02T23:40:53Z) - Streaming Graph Neural Networks via Continual Learning [31.810308087441445]
Graph neural networks (GNNs) have achieved strong performance in various applications.
In this paper, we propose a streaming GNN model based on continual learning.
We show that our model can efficiently update model parameters and achieve comparable performance to model retraining.
arXiv Detail & Related papers (2020-09-23T06:52:30Z) - Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph
Link Prediction [69.1473775184952]
We introduce a realistic problem of few-shot out-of-graph link prediction.
We tackle this problem with a novel transductive meta-learning framework.
We validate our model on multiple benchmark datasets for knowledge graph completion and drug-drug interaction prediction.
arXiv Detail & Related papers (2020-06-11T17:42:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.