Self-Consistent Velocity Matching of Probability Flows
- URL: http://arxiv.org/abs/2301.13737v4
- Date: Tue, 14 Nov 2023 04:44:08 GMT
- Title: Self-Consistent Velocity Matching of Probability Flows
- Authors: Lingxiao Li, Samuel Hurault, Justin Solomon
- Abstract summary: We present a discretization-free scalable framework for solving a large class of mass-conserving partial differential equations (PDEs).
The main observation is that the time-varying velocity field of the PDE solution needs to be self-consistent.
We use an iterative formulation with a biased gradient estimator that bypasses significant computational obstacles with strong empirical performance.
- Score: 22.2542921090435
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: We present a discretization-free scalable framework for solving a large class
of mass-conserving partial differential equations (PDEs), including the
time-dependent Fokker-Planck equation and the Wasserstein gradient flow. The
main observation is that the time-varying velocity field of the PDE solution
needs to be self-consistent: it must satisfy a fixed-point equation involving
the probability flow characterized by the same velocity field. Instead of
directly minimizing the residual of the fixed-point equation with neural
parameterization, we use an iterative formulation with a biased gradient
estimator that bypasses significant computational obstacles with strong
empirical performance. Compared to existing approaches, our method does not
suffer from temporal or spatial discretization, covers a wider range of PDEs,
and scales to high dimensions. Experimentally, our method recovers analytical
solutions accurately when they are available and achieves superior performance
in high dimensions with less training time compared to alternatives.
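For concreteness, the fixed-point condition can be written out for the Fokker-Planck case; the notation below (drift b, diffusion coefficient D) is assumed for illustration rather than taken from the abstract:

```latex
% Assumed notation (not from the abstract): b is the drift and D > 0 the diffusion
% coefficient of the time-dependent Fokker-Planck equation
%   \partial_t \rho = -\nabla \cdot (\rho\, b) + D\, \Delta \rho .
\begin{align*}
  &\text{probability flow of a candidate field } v:
   && \partial_t \rho^{v}_t + \nabla \cdot \big(\rho^{v}_t\, v(\cdot, t)\big) = 0,
      \qquad \rho^{v}_0 = \rho_0, \\
  &\text{self-consistency (fixed point):}
   && v(x, t) = b(x, t) - D\, \nabla \log \rho^{v}_t(x)
      \qquad \text{for all } x,\, t .
\end{align*}
```

The density ρ^v is the one transported by the candidate velocity field itself, which is what makes the condition self-referential; per the abstract, the method iterates a neural parameterization of v toward this fixed point instead of minimizing the residual directly.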
Related papers
- A Simulation-Free Deep Learning Approach to Stochastic Optimal Control [12.699529713351287]
We propose a simulation-free algorithm for the solution of generic problems in stochastic optimal control (SOC).
Unlike existing methods, our approach does not require the solution of an adjoint problem.
arXiv Detail & Related papers (2024-10-07T16:16:53Z)
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Learning Semilinear Neural Operators: A Unified Recursive Framework For Prediction And Data Assimilation [21.206744437644982]
We propose a learning-based state-space approach to compute solution operators to infinite-dimensional semilinear PDEs.
We develop a flexible method that allows for both prediction and data assimilation by combining prediction and correction operations.
We show through experiments on Kuramoto-Sivashinsky, Navier-Stokes and Korteweg-de Vries equations that the proposed model is robust to noise and can leverage arbitrary amounts of measurements to correct its prediction over a long time horizon with little computational overhead.
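A minimal sketch of the generic prediction/correction pattern this entry refers to, using a scalar Kalman filter as a stand-in; the paper's recursive operator acts on infinite-dimensional PDE states, so everything below is a toy assumption for intuition only.

```python
import numpy as np

# Toy predict/correct (data-assimilation) recursion: a scalar Kalman filter.
# Only meant to illustrate the generic prediction + correction pattern,
# not the paper's neural-operator architecture.

def kalman_step(mean, var, y, a=0.95, q=0.01, r=0.1):
    """One predict/correct step for the linear model x' = a*x + noise, y = x + noise."""
    # Prediction: propagate the state estimate through the dynamics.
    mean_pred = a * mean
    var_pred = a**2 * var + q
    # Correction: assimilate the new measurement y.
    gain = var_pred / (var_pred + r)
    mean_new = mean_pred + gain * (y - mean_pred)
    var_new = (1.0 - gain) * var_pred
    return mean_new, var_new

rng = np.random.default_rng(0)
x, mean, var = 1.0, 0.0, 1.0
for t in range(20):
    x = 0.95 * x + rng.normal(scale=0.1)        # true (hidden) dynamics
    y = x + rng.normal(scale=np.sqrt(0.1))      # noisy measurement
    mean, var = kalman_step(mean, var, y)       # predict, then correct
print(f"true state {x:.3f}, filtered estimate {mean:.3f}")
```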
arXiv Detail & Related papers (2024-02-24T00:10:51Z)
- Amortized Reparametrization: Efficient and Scalable Variational Inference for Latent SDEs [3.2634122554914002]
We consider the problem of inferring latent differential equations with a time and memory cost that scales independently of the amount of data, the total length of the time series, and the stiffness of the approximate differential equations.
This is in stark contrast to typical methods for inferring latent differential equations which, despite their constant memory cost, have a time complexity that is heavily dependent on the stiffness of the approximate differential equation.
arXiv Detail & Related papers (2023-12-16T22:27:36Z)
- Non-Parametric Learning of Stochastic Differential Equations with Non-asymptotic Fast Rates of Convergence [65.63201894457404]
We propose a novel non-parametric learning paradigm for the identification of drift and diffusion coefficients of non-linear stochastic differential equations.
The key idea essentially consists of fitting an RKHS-based approximation of the corresponding Fokker-Planck equation to observations of the process.
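As a rough illustration of kernel-based drift recovery from trajectory data: the sketch below regresses conditional increments with kernel ridge regression. This is not the paper's estimator (which fits an RKHS approximation of the Fokker-Planck equation); the drift, noise level, and hyperparameters are assumptions for the toy example.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Stand-in for non-parametric drift identification: regress the conditional
# increments E[dX | X = x] ~ b(x) dt of a simulated path with kernel ridge regression.

rng = np.random.default_rng(0)
dt, n_steps, sigma = 0.05, 2000, 0.7
b = lambda x: x - x**3                      # assumed double-well drift for the toy SDE

x = np.empty(n_steps); x[0] = 0.0
for k in range(n_steps - 1):                # Euler-Maruyama simulation of the SDE
    x[k + 1] = x[k] + b(x[k]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

X, y = x[:-1].reshape(-1, 1), np.diff(x) / dt
model = KernelRidge(kernel="rbf", gamma=1.0, alpha=1.0).fit(X, y)

grid = np.linspace(-1.2, 1.2, 7).reshape(-1, 1)
for xi, est in zip(grid.ravel(), model.predict(grid)):
    print(f"x={xi:+.2f}  estimated drift {est:+.3f}  true drift {b(xi):+.3f}")  # rough match
```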
arXiv Detail & Related papers (2023-05-24T20:43:47Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
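A minimal sketch of the probabilistic-representation idea for a 1-D convection-diffusion equation, where the solution is an expectation over random particles. This is a generic Feynman-Kac Monte Carlo estimate under an assumed Gaussian initial condition, not the paper's trained neural solver.

```python
import numpy as np

# Feynman-Kac-style Monte Carlo for the 1-D convection-diffusion equation
#   u_t + c u_x = D u_xx,   u(x, 0) = g(x).
# Random particles X_t = x - c t + sqrt(2 D t) Z give u(x, t) = E[g(X_t)].

rng = np.random.default_rng(0)
c, D, t, s = 1.0, 0.2, 0.5, 0.3            # advection speed, diffusivity, time, initial width
g = lambda x: np.exp(-x**2 / (2 * s**2))   # Gaussian initial condition (assumed)

def u_mc(x, n_samples=200_000):
    z = rng.standard_normal(n_samples)
    return g(x - c * t + np.sqrt(2 * D * t) * z).mean()

def u_exact(x):
    # Closed form: convolution of the Gaussian initial data with the heat kernel, shifted by c*t.
    var = s**2 + 2 * D * t
    return s / np.sqrt(var) * np.exp(-(x - c * t)**2 / (2 * var))

for x in (-0.5, 0.0, 0.5, 1.0):
    print(f"x={x:+.1f}  MC={u_mc(x):.4f}  exact={u_exact(x):.4f}")
```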
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Probability flow solution of the Fokker-Planck equation [10.484851004093919]
We introduce an alternative to simulating the stochastic dynamics, based on integrating an ordinary differential equation that describes the flow of probability.
Unlike the stochastic dynamics, this equation deterministically pushes samples from the initial density onto samples from the solution at any later time.
Our approach is based on recent advances in score-based diffusion for generative modeling.
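A hand-computable special case of the probability-flow idea, assuming an Ornstein-Uhlenbeck process whose score is available in closed form; the paper learns the score with score-based methods, so this sketch only checks the deterministic transport property.

```python
import numpy as np

# Probability-flow ODE for the OU process  dX = -X dt + sqrt(2) dW.
# The Fokker-Planck solution stays Gaussian, N(m_t, v_t) with
#   m_t = m0 * exp(-t),  v_t = 1 + (v0 - 1) * exp(-2t),
# so the flow  dx/dt = -x - grad log rho_t(x) = -x + (x - m_t) / v_t
# deterministically transports samples of rho_0 to samples of rho_t.

rng = np.random.default_rng(0)
m0, v0, T, n_steps = 2.0, 0.25, 1.0, 1000
dt = T / n_steps

x = m0 + np.sqrt(v0) * rng.standard_normal(50_000)   # samples from rho_0
for k in range(n_steps):
    t = k * dt
    m_t = m0 * np.exp(-t)
    v_t = 1.0 + (v0 - 1.0) * np.exp(-2.0 * t)
    velocity = -x + (x - m_t) / v_t                    # probability-flow velocity
    x = x + dt * velocity                              # explicit Euler step

m_T = m0 * np.exp(-T)
v_T = 1.0 + (v0 - 1.0) * np.exp(-2.0 * T)
print(f"pushed samples: mean {x.mean():.3f} (exact {m_T:.3f}), var {x.var():.3f} (exact {v_T:.3f})")
```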
arXiv Detail & Related papers (2022-06-09T17:37:09Z)
- Self-Consistency of the Fokker-Planck Equation [117.17004717792344]
The Fokker-Planck equation governs the density evolution of the Itô process.
The ground-truth velocity field can be shown to be the solution of a fixed-point equation.
In this paper, we exploit this concept to design a potential function over the hypothesis velocity fields.
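A small numerical check of that fixed-point property, again for an Ornstein-Uhlenbeck process where the density induced by an affine velocity field stays Gaussian. This is a hand-computed special case, not the paper's potential-function construction.

```python
import numpy as np

# Self-consistency residual for the OU process dX = -X dt + sqrt(2) dW.
# For an affine candidate field v(x, t) = a_t x + c_t and Gaussian rho_0, the induced
# density stays Gaussian, so the residual v - (b - grad log rho^v) is easy to evaluate.

m0, v0, T, n = 2.0, 0.25, 1.0, 2000
dt = T / n
ts = np.linspace(0.0, T, n + 1)

# Ground-truth Fokker-Planck moments and the corresponding probability-flow field.
m_star = m0 * np.exp(-ts)
v_star = 1.0 + (v0 - 1.0) * np.exp(-2.0 * ts)
a = -1.0 + 1.0 / v_star          # candidate field v(x, t) = a_t x + c_t
c = -m_star / v_star

def residual(a, c):
    """Max self-consistency residual of the affine field (a_t, c_t) over a grid of x."""
    m, v = m0, v0
    worst = 0.0
    xs = np.linspace(-3.0, 3.0, 61)
    for k in range(n + 1):
        target = -xs + (xs - m) / v          # b(x) - grad log of the *induced* density
        worst = max(worst, np.max(np.abs(a[k] * xs + c[k] - target)))
        if k < n:                            # Euler step of the induced Gaussian moments
            m, v = m + dt * (a[k] * m + c[k]), v + dt * (2.0 * a[k] * v)
    return worst

print("residual of ground-truth field :", residual(a, c))        # ~ 0 (up to Euler error)
print("residual of perturbed field    :", residual(a + 0.5, c))  # clearly nonzero
```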
arXiv Detail & Related papers (2022-06-02T03:44:23Z)
- Probabilistic Numerical Method of Lines for Time-Dependent Partial Differential Equations [20.86460521113266]
Current state-of-the-art PDE solvers treat the space- and time-dimensions separately, serially, and with black-box algorithms.
We introduce a probabilistic version of the classical method of lines to address this issue.
Joint quantification of space- and time-uncertainty becomes possible without losing the performance benefits of well-tuned ODE solvers.
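For context, a plain (non-probabilistic) method-of-lines sketch for the 1-D heat equation: discretize in space, then hand the resulting ODE system to a well-tuned ODE solver. The paper's contribution is to replace that black-box ODE solver with a probabilistic one so that time-discretization uncertainty is also quantified; the grid, boundary conditions, and solver settings below are assumptions for the toy example.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Method of lines for u_t = u_xx on [0, 1] with zero Dirichlet boundaries:
# finite differences in space, then an off-the-shelf ODE solver in time.

n = 99                                    # interior grid points
x = np.linspace(0.0, 1.0, n + 2)[1:-1]
h = x[1] - x[0]
u0 = np.sin(np.pi * x)                    # exact solution: exp(-pi^2 t) * sin(pi x)

def rhs(t, u):
    # Second-order central-difference Laplacian with zero boundary values.
    lap = np.empty_like(u)
    lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]
    lap[0] = u[1] - 2 * u[0]
    lap[-1] = u[-2] - 2 * u[-1]
    return lap / h**2

sol = solve_ivp(rhs, (0.0, 0.1), u0, method="RK45", rtol=1e-8, atol=1e-10)
u_T = sol.y[:, -1]
err = np.max(np.abs(u_T - np.exp(-np.pi**2 * 0.1) * np.sin(np.pi * x)))
print(f"max error at t=0.1: {err:.2e}")
```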
arXiv Detail & Related papers (2021-10-22T15:26:05Z)
- Large-Scale Wasserstein Gradient Flows [84.73670288608025]
We introduce a scalable scheme to approximate Wasserstein gradient flows.
Our approach relies on input convex neural networks (ICNNs) to discretize the JKO steps.
As a result, we can sample from the measure at each step of the diffusion and compute its density.
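A toy particle version of one JKO (minimizing-movement) step for a pure potential energy, where the step has a closed form. The paper instead parameterizes each step's transport map as the gradient of an ICNN; this sketch only sanity-checks the scheme's contraction behaviour, and all quantities in it are assumed for illustration.

```python
import numpy as np

# Minimizing-movement (JKO) scheme with equal-weight particles for the energy
# F(rho) = int V d(rho) with V(x) = x^2 / 2.  With the identity coupling,
# each JKO step reduces to a per-particle proximal step:
#   x_{k+1} = argmin_y (y - x_k)^2 / (2 * tau) + V(y) = x_k / (1 + tau).

rng = np.random.default_rng(0)
tau, n_steps = 0.05, 40                              # step size and number of JKO steps
x = rng.normal(loc=1.0, scale=0.5, size=100_000)     # particles approximating rho_0

for _ in range(n_steps):
    x = x / (1.0 + tau)                              # closed-form proximal step for V(x) = x^2 / 2

t = tau * n_steps
print(f"particle mean {x.mean():.4f} vs gradient-flow mean {1.0 * np.exp(-t):.4f}")
print(f"particle std  {x.std():.4f} vs gradient-flow std  {0.5 * np.exp(-t):.4f}")
```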
arXiv Detail & Related papers (2021-06-01T19:21:48Z)
- DiffPD: Differentiable Projective Dynamics with Contact [65.88720481593118]
We present DiffPD, an efficient differentiable soft-body simulator with implicit time integration.
We evaluate the performance of DiffPD and observe a speedup of 4-19 times compared to the standard Newton's method in various applications.
arXiv Detail & Related papers (2021-01-15T00:13:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.