Operator-learning-inspired Modeling of Neural Ordinary Differential
Equations
- URL: http://arxiv.org/abs/2312.10274v1
- Date: Sat, 16 Dec 2023 00:29:15 GMT
- Title: Operator-learning-inspired Modeling of Neural Ordinary Differential
Equations
- Authors: Woojin Cho, Seunghyeon Cho, Hyundong Jin, Jinsung Jeon, Kookjin Lee,
Sanghyun Hong, Dongeun Lee, Jonghyun Choi, Noseong Park
- Abstract summary: We present a neural operator-based method to define the time-derivative term.
In our experiments with general downstream tasks, our method significantly outperforms existing methods.
- Score: 38.17903151426809
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural ordinary differential equations (NODEs), one of the most
influential works in differential equation-based deep learning, continuously
generalize residual networks and opened a new field. They are currently used
for various downstream tasks, e.g., image classification, time-series
classification, and image generation. Their key component is how the
time-derivative of the hidden state, denoted dh(t)/dt, is modeled. It has
habitually been a conventional neural network architecture, e.g.,
fully-connected layers followed by non-linear activations. In this paper,
however, we present a neural operator-based method to define the
time-derivative term. Neural operators were initially proposed to model the
differential operators of partial differential equations (PDEs). Since the
time-derivative of a NODE can be understood as a special type of differential
operator, our proposed method, called the branched Fourier neural operator
(BFNO), is a natural fit. In our experiments with general downstream tasks, it
significantly outperforms existing methods.
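The abstract describes BFNO only at a high level, so the following is a minimal sketch under assumptions, not the authors' architecture: a NODE whose time-derivative network is a single FNO-style spectral layer plus a pointwise path, evolved by a fixed-step Euler loop standing in for a proper ODE solver. The names (SpectralDerivative, odeint_euler) and all sizes are illustrative.

```python
# Minimal sketch, NOT the paper's exact BFNO: dh(t)/dt is modeled by one
# FNO-style spectral layer plus a pointwise (1x1 conv) path. All names and
# sizes here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpectralDerivative(nn.Module):
    """Models dh/dt via truncated Fourier mixing of the hidden state."""
    def __init__(self, channels: int, modes: int):
        super().__init__()
        self.modes = modes
        # Complex weights for the retained low-frequency Fourier modes.
        self.w_fourier = nn.Parameter(
            torch.randn(channels, channels, modes, dtype=torch.cfloat) / channels)
        self.w_local = nn.Conv1d(channels, channels, kernel_size=1)

    def forward(self, t, h):                      # h: (batch, channels, grid)
        h_hat = torch.fft.rfft(h, dim=-1)         # to frequency space
        out_hat = torch.zeros_like(h_hat)
        m = min(self.modes, h_hat.shape[-1])
        out_hat[..., :m] = torch.einsum(          # mode-wise channel mixing
            "bim,iom->bom", h_hat[..., :m], self.w_fourier[..., :m])
        spectral = torch.fft.irfft(out_hat, n=h.shape[-1], dim=-1)
        return F.gelu(self.w_local(h) + spectral)

def odeint_euler(f, h0, t0=0.0, t1=1.0, steps=20):
    """Fixed-step Euler integration of dh/dt = f(t, h); stands in for a solver."""
    h, dt = h0, (t1 - t0) / steps
    for k in range(steps):
        h = h + dt * f(t0 + k * dt, h)
    return h

f = SpectralDerivative(channels=8, modes=6)
h0 = torch.randn(4, 8, 32)             # (batch, channels, grid)
h1 = odeint_euler(f, h0)               # hidden state evolved from t=0 to t=1
print(h1.shape)                        # torch.Size([4, 8, 32])
```

The point of the sketch is the swap the paper argues for: the derivative field f(t, h) is a discretized operator acting on the whole hidden state, not a pointwise fully-connected map.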
Related papers
- Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
arXiv Detail & Related papers (2024-02-26T18:59:31Z)
- Approximating Numerical Fluxes Using Fourier Neural Operators for Hyperbolic Conservation Laws [7.438389089520601]
Neural network-based methods, such as physics-informed neural networks (PINNs) and neural operators, exhibit deficiencies in robustness and generalization.
In this study, we focus on hyperbolic conservation laws by replacing traditional numerical flux with neural operators.
Our approach combines the strengths of both traditional numerical schemes and FNOs, outperforming standard FNO methods in several respects.
arXiv Detail & Related papers (2024-01-03T15:16:25Z)
- PMNN: Physical Model-driven Neural Network for solving time-fractional differential equations [17.66402435033991]
An innovative Physical Model-driven Neural Network (PMNN) method is proposed to solve time-fractional differential equations.
It effectively combines deep neural networks (DNNs) with approximation of fractional derivatives.
arXiv Detail & Related papers (2023-10-07T12:43:32Z)
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history-dependencies and discontinuities in time can be represented as summations of complex exponentials (a small numeric sketch of this representation follows the related-papers list below).
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Pseudo-Differential Neural Operator: Generalized Fourier Neural Operator for Learning Solution Operators of Partial Differential Equations [14.43135909469058]
We propose a novel pseudo-differential integral operator (PDIO) to analyze and generalize the Fourier integral operator in FNO.
We experimentally validate the effectiveness of the proposed model by utilizing Darcy flow and the Navier-Stokes equation.
arXiv Detail & Related papers (2022-01-28T07:22:32Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster than traditional PDE solvers.
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
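Since the Fourier Neural Operator entry directly above is the architecture the main paper builds on, a hedged single-layer illustration of its central trick may help: the integral kernel is parameterized directly on a fixed number of Fourier modes, so the same weights define an operator at any grid resolution. This is not the full lift/spectral-block/project FNO architecture; the name SpectralConv1d and all sizes are assumptions.

```python
# Hedged single-layer illustration of the FNO kernel parameterization (not
# the full architecture): the kernel lives on a fixed set of Fourier modes,
# so the same weights act on any grid resolution. Names/sizes are assumptions.
import torch

class SpectralConv1d(torch.nn.Module):
    def __init__(self, in_ch: int, out_ch: int, modes: int):
        super().__init__()
        self.modes = modes
        # Complex kernel weights, defined per retained Fourier mode.
        self.weight = torch.nn.Parameter(
            torch.randn(in_ch, out_ch, modes, dtype=torch.cfloat) / in_ch)

    def forward(self, x):                          # x: (batch, in_ch, grid)
        x_hat = torch.fft.rfft(x, dim=-1)          # frequency representation
        m = min(self.modes, x_hat.shape[-1])
        out_hat = torch.zeros(x.shape[0], self.weight.shape[1],
                              x_hat.shape[-1], dtype=torch.cfloat)
        out_hat[..., :m] = torch.einsum(
            "bim,iom->bom", x_hat[..., :m], self.weight[..., :m])
        return torch.fft.irfft(out_hat, n=x.shape[-1], dim=-1)

layer = SpectralConv1d(3, 3, modes=8)
u_coarse = layer(torch.randn(2, 3, 64))    # 64-point grid
u_fine = layer(torch.randn(2, 3, 256))     # 256-point grid, same weights
print(u_coarse.shape, u_fine.shape)
```

Because only the lowest `modes` frequencies carry parameters, the layer is discretization-invariant: the two calls above apply the same weights on 64- and 256-point grids.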
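As referenced in the Neural Laplace entry above, here is a tiny numeric sketch of its representation claim (not the method itself): a trajectory written as a sum of complex exponentials, x(t) = Re(sum_k c_k exp(s_k t)). The poles and residues below are hand-picked, purely illustrative values.

```python
# Numeric illustration of the Laplace-domain representation claim (not
# Neural Laplace itself): x(t) = Re(sum_k c_k * exp(s_k * t)). The poles s_k
# and residues c_k below are hand-picked, purely illustrative values.
import numpy as np

t = np.linspace(0.0, 10.0, 200)
poles = np.array([-0.2 + 3.0j, -0.5 + 0.0j])    # complex frequencies s_k
residues = np.array([1.0 + 0.0j, 0.8 + 0.0j])   # coefficients c_k
# Damped oscillation plus a pure decay, from just two complex exponentials.
x = np.real(np.exp(np.outer(t, poles)) @ residues)
print(x[:3])
```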