Log Neural Controlled Differential Equations: The Lie Brackets Make a Difference
- URL: http://arxiv.org/abs/2402.18512v3
- Date: Mon, 28 Oct 2024 09:55:56 GMT
- Title: Log Neural Controlled Differential Equations: The Lie Brackets Make a Difference
- Authors: Benjamin Walker, Andrew D. McLeod, Tiexin Qin, Yichuan Cheng, Haoliang Li, Terry Lyons
- Abstract summary: Neural CDEs (NCDEs) treat time series data as observations from a control path.
We introduce Log-NCDEs, a novel, effective, and efficient method for training NCDEs.
- Score: 22.224853384201595
- Abstract: The vector field of a controlled differential equation (CDE) describes the relationship between a control path and the evolution of a solution path. Neural CDEs (NCDEs) treat time series data as observations from a control path, parameterise a CDE's vector field using a neural network, and use the solution path as a continuously evolving hidden state. As their formulation makes them robust to irregular sampling rates, NCDEs are a powerful approach for modelling real-world data. Building on neural rough differential equations (NRDEs), we introduce Log-NCDEs, a novel, effective, and efficient method for training NCDEs. The core component of Log-NCDEs is the Log-ODE method, a tool from the study of rough paths for approximating a CDE's solution. Log-NCDEs are shown to outperform NCDEs, NRDEs, the linear recurrent unit, S5, and MAMBA on a range of multivariate time series datasets with up to $50{,}000$ observations.
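The abstract describes the NCDE formulation: a hidden state z evolves according to dz = f_theta(z) dX, where X is a control path interpolated from the (possibly irregularly sampled) time series and f_theta is a neural network mapping the hidden state to a matrix contracted against path increments. A minimal sketch of this update, using a fixed random MLP in place of trained weights (all names and dimensions here are illustrative, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
h, d = 4, 2  # hidden-state and control-path dimensions (illustrative)

# Hypothetical MLP weights standing in for the trained vector field f_theta.
W1, b1 = 0.1 * rng.normal(size=(16, h)), np.zeros(16)
W2, b2 = 0.1 * rng.normal(size=(h * d, 16)), np.zeros(h * d)

def vector_field(z):
    """f_theta: R^h -> R^{h x d}, a small tanh MLP."""
    hidden = np.tanh(W1 @ z + b1)
    return (W2 @ hidden + b2).reshape(h, d)

def ncde_euler(ts, xs, z0):
    """Euler scheme for dz = f_theta(z) dX along a piecewise-linear
    interpolation of observations xs at (possibly irregular) times ts."""
    z = z0.copy()
    for k in range(len(ts) - 1):
        dX = xs[k + 1] - xs[k]         # increment of the control path
        z = z + vector_field(z) @ dX   # z_{k+1} = z_k + f(z_k) dX_k
    return z

# Irregular sampling: the same update applies unchanged, since only
# the increments of the interpolated control path enter the scheme.
ts = np.array([0.0, 0.3, 0.35, 1.0, 1.7])
xs = rng.normal(size=(len(ts), d))
z_final = ncde_euler(ts, xs, np.ones(h))
```

Note that the time stamps affect the result only through the interpolated path, which is what makes the formulation robust to irregular sampling rates.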
Related papers
- Solving Poisson Equations using Neural Walk-on-Spheres [80.1675792181381]
We propose Neural Walk-on-Spheres (NWoS), a novel neural PDE solver for the efficient solution of high-dimensional Poisson equations.
We demonstrate the superiority of NWoS in accuracy, speed, and computational costs.
arXiv Detail & Related papers (2024-06-05T17:59:22Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Elucidating the solution space of extended reverse-time SDE for diffusion models [54.23536653351234]
Diffusion models (DMs) demonstrate potent image generation capabilities in various generative modeling tasks.
Their primary limitation lies in slow sampling speed, requiring hundreds or thousands of sequential function evaluations to generate high-quality images.
We formulate the sampling process as an extended reverse-time SDE, unifying prior explorations into ODEs and SDEs.
We devise fast and training-free samplers, ER-SDE-rs, achieving state-of-the-art performance across all samplers.
arXiv Detail & Related papers (2023-09-12T12:27:17Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Learnable Path in Neural Controlled Differential Equations [11.38331901271794]
Neural controlled differential equations (NCDEs) are a specialized model in (irregular) time-series processing.
We present a method to generate another latent path, which is analogous to learning an appropriate interpolation method.
We design an encoder-decoder module based on NCDEs and NODEs, and a special training method for it.
arXiv Detail & Related papers (2023-01-11T07:05:27Z) - A Kernel Approach for PDE Discovery and Operator Learning [9.463496582811633]
Kernel smoothing is utilized to denoise the data and approximate derivatives of the solution.
The learned PDE is then used within a kernel based solver to approximate the solution of the PDE with a new source/boundary term.
arXiv Detail & Related papers (2022-10-14T22:33:28Z) - Learning time-dependent PDE solver using Message Passing Graph Neural Networks [0.0]
We introduce a graph neural network approach to finding efficient PDE solvers through learning using message-passing models.
We use graphs to represent PDE-data on an unstructured mesh and show that message passing graph neural networks (MPGNN) can parameterize governing equations.
We show that a recurrent graph neural network approach can find a temporal sequence of solutions to a PDE.
arXiv Detail & Related papers (2022-04-15T21:10:32Z) - Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method that improves the sample complexity of neural PDE solvers.
In the context of PDEs, Lie point symmetries allow us to quantitatively derive an exhaustive list of data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
arXiv Detail & Related papers (2022-02-15T18:43:17Z) - Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z) - Neural Rough Differential Equations for Long Time Series [19.004296236396947]
We use rough path theory to extend the formulation of Neural CDEs.
Instead of directly embedding into path space, we represent the input signal over small time intervals through its log-signature.
This is the approach used for solving rough differential equations (RDEs).
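The log-signature mentioned here summarises a path over an interval: at depth 1 it is the total increment, and at depth 2 it adds the antisymmetric Lévy area, whose entries are the coefficients of the Lie brackets that appear in the Log-ODE update. A sketch of the depth-2 log-signature of a piecewise-linear path (an illustrative implementation, not the library code used by the papers):

```python
import numpy as np

def depth2_logsig(xs):
    """Depth-2 log-signature of the piecewise-linear path through xs.

    Returns the total increment (level 1) and the antisymmetric Levy
    area matrix (level 2); the level-2 entries multiply the Lie
    brackets [f_i, f_j] of the vector fields in the Log-ODE method.
    """
    dX = np.diff(xs, axis=0)          # segment increments
    increment = dX.sum(axis=0)        # level 1: total displacement
    cum = np.cumsum(dX, axis=0) - dX  # sum of increments before step k
    # outer[i, j] = sum_{l < k} dX_l[i] * dX_k[j]
    outer = cum.T @ dX
    area = 0.5 * (outer - outer.T)    # level 2: antisymmetric Levy area
    return increment, area

# An L-shaped path: right one unit, then up one unit.
inc, area = depth2_logsig(np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]]))
```

A straight-line path has zero Lévy area, so depth-2 terms only contribute when the path genuinely winds; this is the extra information the log-signature carries over the bare increment.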
arXiv Detail & Related papers (2020-09-17T13:43:47Z) - Identifying Latent Stochastic Differential Equations [29.103393300261587]
We present a method for learning latent stochastic differential equations (SDEs) from high-dimensional time series data.
The proposed method learns the mapping from ambient to latent space, and the underlying SDE coefficients, through a self-supervised learning approach.
We validate the method through several simulated video processing tasks, where the underlying SDE is known, and through real world datasets.
arXiv Detail & Related papers (2020-07-12T19:46:31Z)
This list is automatically generated from the titles and abstracts of the papers on this site.