Learnable Path in Neural Controlled Differential Equations
- URL: http://arxiv.org/abs/2301.04333v1
- Date: Wed, 11 Jan 2023 07:05:27 GMT
- Title: Learnable Path in Neural Controlled Differential Equations
- Authors: Sheo Yon Jhin, Minju Jo, Seungji Kook, Noseong Park, Sungpil Woo,
Sunhwan Lim
- Abstract summary: Neural controlled differential equations (NCDEs) are a specialized model in (irregular) time-series processing.
We present a method to generate another latent path, which is identical to learning an appropriate interpolation method.
We design an encoder-decoder module based on NCDEs and NODEs, and a special training method for it.
- Score: 11.38331901271794
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural controlled differential equations (NCDEs), which are continuous
analogues to recurrent neural networks (RNNs), are a specialized model in
(irregular) time-series processing. In comparison with similar models, e.g.,
neural ordinary differential equations (NODEs), the key distinctive
characteristics of NCDEs are i) the adoption of the continuous path created by
an interpolation algorithm from each raw discrete time-series sample and ii)
the adoption of the Riemann--Stieltjes integral. It is the continuous path
that makes NCDEs continuous analogues of RNNs. However, NCDEs rely on
existing interpolation algorithms to create the path, and it is unclear
whether those algorithms can create an optimal path. To this end, we present
a method to generate another
latent path (rather than relying on existing interpolation algorithms), which
is identical to learning an appropriate interpolation method. We design an
encoder-decoder module based on NCDEs and NODEs, and a special training method
for it. Our method shows the best performance in both time-series
classification and forecasting.
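The abstract's core object can be made concrete: an NCDE evolves a hidden state as z(T) = z(t_0) + ∫ f_θ(z(s)) dX(s), a Riemann--Stieltjes integral driven by a path X interpolated from the discrete sample. Below is a minimal NumPy sketch of that integral with linear interpolation and an Euler discretization; it is an illustrative toy, not the authors' implementation, and the vector field `f`, its weights, and all shapes are hypothetical.

```python
import numpy as np

def path_at(ts, xs, t):
    """Linearly interpolated control path X(t), one value per channel."""
    return np.array([np.interp(t, ts, xs[:, d]) for d in range(xs.shape[1])])

def ncde_euler(ts, xs, f, z0, n_steps=200):
    """Euler / Riemann--Stieltjes sum: z <- z + f(z) dX over a fine grid."""
    grid = np.linspace(ts[0], ts[-1], n_steps + 1)
    z = z0.copy()
    for t0, t1 in zip(grid[:-1], grid[1:]):
        dX = path_at(ts, xs, t1) - path_at(ts, xs, t0)
        z = z + f(z) @ dX          # f(z) is a (hidden, channels) matrix
    return z

# Toy run: 1 input channel, 2 hidden dims, a random fixed-seed vector field.
rng = np.random.default_rng(0)
ts = np.array([0.0, 0.3, 1.0, 2.0])        # irregular observation times
xs = rng.standard_normal((4, 1))           # one discrete time-series sample
W = 0.1 * rng.standard_normal((2, 2, 1))   # hypothetical weights for f
f = lambda z: np.tanh(np.einsum("hkc,k->hc", W, z))
zT = ncde_euler(ts, xs, f, np.ones(2))
```

Replacing the fixed interpolation scheme in `path_at` with a learned latent path is, roughly, the design space the paper explores.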
Related papers
- Efficient Training of Neural Stochastic Differential Equations by Matching Finite Dimensional Distributions [3.889230974713832]
We develop a novel scoring rule for comparing continuous Markov processes.
This scoring rule allows us to bypass the computational overhead associated with signature kernels.
We demonstrate that FDM achieves superior performance, consistently outperforming existing methods in terms of both computational efficiency and generative quality.
arXiv Detail & Related papers (2024-10-04T23:26:38Z)
- Log Neural Controlled Differential Equations: The Lie Brackets Make a Difference [22.224853384201595]
Neural CDEs (NCDEs) treat time series data as observations from a control path.
We introduce Log-NCDEs, a novel, effective, and efficient method for training NCDEs.
arXiv Detail & Related papers (2024-02-28T17:40:05Z)
- Discrete Neural Algorithmic Reasoning [18.497863598167257]
We propose to force neural reasoners to maintain the execution trajectory as a combination of finite predefined states.
Trained with supervision on the algorithm's state transitions, such models are able to align perfectly with the original algorithm.
arXiv Detail & Related papers (2024-02-18T16:03:04Z)
- PMNN: Physical Model-driven Neural Network for solving time-fractional differential equations [17.66402435033991]
An innovative Physical Model-driven Neural Network (PMNN) method is proposed to solve time-fractional differential equations.
It effectively combines deep neural networks (DNNs) with approximation of fractional derivatives.
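The PMNN entry hinges on numerically approximating fractional derivatives. As background, here is a hedged sketch of the Grünwald--Letnikov scheme, one standard such approximation (PMNN's exact discretization may differ), checked against the closed form D^0.5 t^2 = Γ(3)/Γ(2.5)·t^1.5:

```python
import math
import numpy as np

def gl_fractional_derivative(f, alpha, t, h=1e-3):
    """Grunwald-Letnikov approximation of the order-alpha derivative of f at t:
    h^(-alpha) * sum_k w_k f(t - k h), with w_k = (-1)^k binom(alpha, k)."""
    n = int(round(t / h))
    w = np.empty(n + 1)
    w[0] = 1.0
    for k in range(1, n + 1):
        w[k] = w[k - 1] * (k - 1 - alpha) / k   # recurrence for the weights
    vals = f(t - h * np.arange(n + 1))
    return h ** (-alpha) * np.dot(w, vals)

# Check against the known result D^0.5 t^2 = Gamma(3)/Gamma(2.5) * t^1.5 at t=1.
approx = gl_fractional_derivative(lambda s: s ** 2, 0.5, 1.0)
exact = math.gamma(3) / math.gamma(2.5)
```

The scheme is first-order accurate in the step size h, so the error here is on the order of 1e-3.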
arXiv Detail & Related papers (2023-10-07T12:43:32Z)
- Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
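The POD step in this entry can be illustrated on its own: a POD basis is simply the leading left singular vectors of a snapshot matrix, onto which new solution fields are projected. A minimal sketch with toy 1-D snapshots (not the paper's setup):

```python
import numpy as np

# Toy snapshot matrix: each column is one (discretized) solution sample.
x = np.linspace(0.0, 1.0, 200)
snapshots = np.stack([np.sin((k + 1) * np.pi * x) / (k + 1)
                      for k in range(20)], axis=1)      # shape (200, 20)

# POD basis = leading left singular vectors of the snapshot matrix.
U, S, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 5
basis = U[:, :r]                 # orthonormal reduced-order basis, (200, r)

# Reduced-order approximation: project a new field, then reconstruct.
field = np.sin(2.0 * np.pi * x)  # lies (nearly) in the dominant subspace
coeffs = basis.T @ field         # r POD coefficients
recon = basis @ coeffs
rel_err = np.linalg.norm(recon - field) / np.linalg.norm(field)
```

In the paper's framing, the neural networks are regressed onto such a basis, and the branch network predicts the reduced coefficients from the PDE parameters.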
arXiv Detail & Related papers (2022-08-02T18:27:13Z)
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
- EXIT: Extrapolation and Interpolation-based Neural Controlled Differential Equations for Time-series Classification and Forecasting [19.37382379378985]
Neural controlled differential equations (NCDEs) are considered a breakthrough in deep learning.
In this work, we enhance NCDEs by redesigning their core part, i.e., generating a continuous path from a discrete time-series input.
Our NCDE design can use both the interpolated and the extrapolated information for downstream machine learning tasks.
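The interpolation/extrapolation contrast in this entry is easy to picture: interpolation reads the path between observed timestamps, while extrapolation continues it past the last observation, as a forecaster must. A minimal linear sketch (illustrative only; EXIT's actual path construction differs):

```python
import numpy as np

def linear_path(ts, xs, t):
    """Value of the piecewise-linear path at time t; continues the last
    segment's slope past ts[-1] (forward extrapolation for forecasting)."""
    if t <= ts[-1]:
        return float(np.interp(t, ts, xs))       # interpolation
    slope = (xs[-1] - xs[-2]) / (ts[-1] - ts[-2])
    return float(xs[-1] + slope * (t - ts[-1]))  # extrapolation

ts = np.array([0.0, 1.0, 2.5, 4.0])   # irregular observation times
xs = np.array([0.0, 2.0, 2.0, 5.0])
inside = linear_path(ts, xs, 0.5)     # between samples -> 1.0
beyond = linear_path(ts, xs, 5.0)     # past the last sample -> 7.0
```

Note that `np.interp` alone clamps to the endpoint values outside the observed range, which is why the extrapolation branch is handled explicitly.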
arXiv Detail & Related papers (2022-04-19T09:37:36Z)
- Incorporating NODE with Pre-trained Neural Differential Operator for Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO)
NDO is pre-trained on a class of symbolic functions, and it learns the mapping between the trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can closely approximate the ground-truth derivatives by properly tuning the complexity of the library.
arXiv Detail & Related papers (2021-06-08T08:04:47Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
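The min-max formulation in this entry can be shown on a toy instance: for a one-dimensional model y = θx + ε, the moment condition E[(y − θx)x] = 0 can be rewritten as min over θ, max over w of w·E[(y − θx)x] − w²/2, and both players trained by simultaneous gradient descent-ascent. The sketch below uses scalar "players" instead of neural networks and is purely illustrative of the game structure, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.standard_normal(n)
y = 1.5 * x + 0.1 * rng.standard_normal(n)    # ground-truth theta = 1.5

theta, w, lr = 0.0, 0.0, 0.05
for _ in range(3000):
    moment = np.mean((y - theta * x) * x)     # E[(y - theta x) x]
    w += lr * (moment - w)                    # max player ascends in w
    theta += lr * w * np.mean(x * x)          # min player: descent, dL/dtheta = -w E[x^2]
```

At the saddle point the max player's optimum is w = E[(y − θx)x], so driving w to zero forces the moment condition to hold and θ recovers the true coefficient.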
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Interpolation Technique to Speed Up Gradients Propagation in Neural ODEs [71.26657499537366]
We propose a simple interpolation-based method for the efficient approximation of gradients in neural ODE models.
We compare it with the reverse dynamic method to train neural ODEs on classification, density estimation, and inference approximation tasks.
arXiv Detail & Related papers (2020-03-11T13:15:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.