Climate Modeling with Neural Diffusion Equations
- URL: http://arxiv.org/abs/2111.06011v1
- Date: Thu, 11 Nov 2021 01:48:46 GMT
- Title: Climate Modeling with Neural Diffusion Equations
- Authors: Jeehyun Hwang, Jeongwhan Choi, Hwangyong Choi, Kookjin Lee, Dongeun
Lee, Noseong Park
- Abstract summary: We design a novel climate model based on the neural ordinary differential equation (NODE) and the diffusion equation.
Our method consistently outperforms existing baselines by non-trivial margins.
- Score: 3.8521112392276
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Owing to the remarkable development of deep learning technology, there have
been a series of efforts to build deep learning-based climate models. Whereas
most of them utilize recurrent neural networks and/or graph neural networks, we
design a novel climate model based on two concepts: the neural ordinary
differential equation (NODE) and the diffusion equation. Many physical
processes involving Brownian motion of particles can be described by the
diffusion equation, and as a result it is widely used in climate modeling.
Neural ordinary differential equations (NODEs), on the other hand, learn a
latent governing ODE from data. In our method, we combine them into a single
framework and propose a concept called the neural diffusion equation (NDE).
Our NDE, equipped with the diffusion equation and an additional neural network
to model inherent uncertainty, can learn an appropriate latent governing
equation that best describes a given climate dataset. In our experiments with
two real-world datasets, one synthetic dataset, and eleven baselines, our
method consistently outperforms the baselines by non-trivial margins.
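As an illustration of the idea (a minimal NumPy sketch, not the authors' implementation), a neural diffusion equation can be read as du/dt = D ∇²u + f_θ(u): the classical diffusion term plus a small neural network f_θ standing in for the learned uncertainty term, integrated here with a simple forward-Euler scheme. The grid size, step sizes, and the tiny random-weight MLP are all illustrative assumptions:

```python
import numpy as np

def laplacian(u):
    """5-point stencil Laplacian with replicate (zero-flux) boundaries."""
    up = np.pad(u, 1, mode="edge")
    return (up[:-2, 1:-1] + up[2:, 1:-1]
            + up[1:-1, :-2] + up[1:-1, 2:] - 4.0 * u)

def neural_source(u, W1, b1, W2, b2):
    """Tiny pointwise MLP standing in for the learned uncertainty term f_theta."""
    h = np.tanh(u[..., None] * W1 + b1)   # (H, W, hidden)
    return (h @ W2 + b2).squeeze(-1)      # (H, W)

def nde_step(u, dt, D, params):
    """One forward-Euler step of du/dt = D * Laplacian(u) + f_theta(u)."""
    return u + dt * (D * laplacian(u) + neural_source(u, *params))

rng = np.random.default_rng(0)
params = (rng.normal(scale=0.1, size=(1, 8)), np.zeros(8),
          rng.normal(scale=0.1, size=(8, 1)), np.zeros(1))
u = rng.random((16, 16))        # e.g. a temperature field on a grid
for _ in range(50):
    u = nde_step(u, 0.1, 0.2, params)
```

In the paper the parameters of f_θ would be trained against the climate data through the ODE solver; here they are frozen random weights just to show the structure of the right-hand side.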
Related papers
- Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems [49.2319247825857]
We show that diffusion-based generative models exhibit many properties favourable for neural operators.
We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training.
arXiv Detail & Related papers (2024-05-11T21:23:55Z)
- Uncertainty and Structure in Neural Ordinary Differential Equations [28.12033356095061]
We show that basic and lightweight Bayesian deep learning techniques like the Laplace approximation can be applied to neural ODEs.
We explore how mechanistic knowledge and uncertainty quantification interact on two recently proposed neural ODE frameworks.
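A deliberately simplified 1-D sketch of the Laplace approximation mentioned above (ignoring the observation-noise scale and using a one-parameter linear model rather than a neural ODE): around the trained optimum, the posterior over parameters is approximated by a Gaussian whose covariance is the inverse Hessian of the loss:

```python
import numpy as np

# Toy regression: noisy observations of a linear right-hand side f(x) = -x.
rng = np.random.default_rng(3)
x = rng.normal(size=100)
y = -1.0 * x + 0.1 * rng.normal(size=100)

# loss(w) = 0.5 * sum((y - w * x)^2); its minimizer and (constant) Hessian.
w_star = np.sum(x * y) / np.sum(x ** 2)     # exact minimizer
hessian = np.sum(x ** 2)                    # d^2 loss / dw^2
sigma = np.sqrt(1.0 / hessian)              # posterior std under the Laplace approx.
```

The same recipe scales to neural ODEs by replacing the scalar Hessian with a (usually diagonal or Kronecker-factored) Hessian approximation over the network weights.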
arXiv Detail & Related papers (2023-05-22T17:50:42Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Learning differential equations from data [0.0]
In recent times, owing to the abundance of data, there is an active search for data-driven methods to learn differential equation models from data.
We propose a forward-Euler based neural network model and test its performance by learning ODEs from data, using different numbers of hidden layers and different network widths.
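A minimal sketch of the forward-Euler idea (not the paper's architecture): given trajectory samples of the true ODE du/dt = -u, fit a model f so that u[n+1] ≈ u[n] + h·f(u[n]). Here f(u) = w·u is a one-parameter stand-in for the paper's deeper network, so the fit reduces to least squares:

```python
import numpy as np

# Ground-truth ODE du/dt = -u, sampled on a uniform time grid.
h = 0.01
t = np.arange(0.0, 2.0, h)
u = np.exp(-t)

# Forward-Euler model: u[n+1] ≈ u[n] + h * f(u[n]) with f(u) = w * u.
# Least squares for w: minimize sum((u[n+1] - u[n] - h*w*u[n])^2).
du = u[1:] - u[:-1]
w = np.sum(du * u[:-1]) / (h * np.sum(u[:-1] ** 2))

print(w)   # close to the true coefficient -1
```

Replacing w·u with an MLP and least squares with gradient descent gives the neural-network version; the Euler residual is the training loss either way.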
arXiv Detail & Related papers (2022-05-23T17:36:28Z)
- Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial information and recurrent neural networks to model the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- On Neural Differential Equations [13.503274710499971]
In particular, neural differential equations (NDEs) demonstrate that neural networks and differential equations are two sides of the same coin.
NDEs are suitable for tackling generative problems, dynamical systems, and time series.
NDEs offer high-capacity function approximation, strong priors on model space, the ability to handle irregular data, memory efficiency, and a wealth of available theory on both sides.
arXiv Detail & Related papers (2022-02-04T23:32:29Z)
- NeuralPDE: Modelling Dynamical Systems from Data [0.44259821861543996]
We propose NeuralPDE, a model which combines convolutional neural networks (CNNs) with differentiable ODE solvers to model dynamical systems.
We show that the Method of Lines used in standard PDE solvers can be represented using convolutions which makes CNNs the natural choice to parametrize arbitrary PDE dynamics.
Our model can be applied to any data without requiring any prior knowledge about the governing PDE.
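The Method-of-Lines observation above can be checked directly: after discretizing space, the finite-difference Laplacian is exactly a fixed convolution kernel, so a CNN layer (fixed or learned) can play the role of the spatial operator. A small NumPy verification:

```python
import numpy as np

# The 5-point Laplacian stencil as a convolution kernel.
kernel = np.array([[0.0,  1.0, 0.0],
                   [1.0, -4.0, 1.0],
                   [0.0,  1.0, 0.0]])

def conv2d_valid(u, k):
    """Naive 'valid' 2-D cross-correlation (equivalent to convolution
    here because the kernel is symmetric)."""
    H, W = u.shape
    out = np.zeros((H - 2, W - 2))
    for i in range(H - 2):
        for j in range(W - 2):
            out[i, j] = np.sum(u[i:i + 3, j:j + 3] * k)
    return out

rng = np.random.default_rng(1)
u = rng.random((8, 8))
stencil = (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
           - 4 * u[1:-1, 1:-1])
assert np.allclose(conv2d_valid(u, kernel), stencil)
```

Learning the kernel instead of fixing it is what lets the CNN parametrize arbitrary PDE dynamics while a differentiable ODE solver handles the time dimension.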
arXiv Detail & Related papers (2021-11-15T10:59:52Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster than traditional PDE solvers.
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
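A minimal 1-D sketch of the spectral layer at the heart of the Fourier Neural Operator (illustrative only; the signal length, mode count, and random complex weights are assumptions): transform to Fourier space, multiply the lowest `modes` frequencies by learned complex weights, truncate the rest, and transform back:

```python
import numpy as np

def spectral_conv1d(u, weights, modes):
    """One FNO-style spectral layer: filter in Fourier space.

    u: (n,) real signal; weights: (modes,) complex; only the lowest
    `modes` frequencies are kept, the rest are truncated.
    """
    u_hat = np.fft.rfft(u)
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = u_hat[:modes] * weights
    return np.fft.irfft(out_hat, n=len(u))

rng = np.random.default_rng(2)
n, modes = 64, 8
weights = rng.normal(size=modes) + 1j * rng.normal(size=modes)
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
y = spectral_conv1d(np.sin(3 * x), weights, modes)
```

Because the layer acts on frequencies rather than grid points, the same learned weights can be applied at any resolution, which is what makes the operator mesh-independent.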
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.