Doob's Lagrangian: A Sample-Efficient Variational Approach to Transition Path Sampling
- URL: http://arxiv.org/abs/2410.07974v4
- Date: Tue, 10 Dec 2024 01:57:00 GMT
- Title: Doob's Lagrangian: A Sample-Efficient Variational Approach to Transition Path Sampling
- Authors: Yuanqi Du, Michael Plainer, Rob Brekelmans, Chenru Duan, Frank Noé, Carla P. Gomes, Alán Aspuru-Guzik, Kirill Neklyudov
- Abstract summary: We propose a variational formulation of Doob's h-transform as an optimization problem over trajectories between a given initial point and the desired ending point. Our approach significantly reduces the search space over trajectories and avoids expensive trajectory simulation. We demonstrate the ability of our method to find feasible transition paths on real-world molecular simulation and protein folding tasks.
- Score: 34.853443523585604
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Rare event sampling in dynamical systems is a fundamental problem arising in the natural sciences, which poses significant computational challenges due to an exponentially large space of trajectories. For settings where the dynamical system of interest follows a Brownian motion with known drift, the question of conditioning the process to reach a given endpoint or desired rare event is definitively answered by Doob's h-transform. However, the naive estimation of this transform is infeasible, as it requires simulating sufficiently many forward trajectories to estimate rare event probabilities. In this work, we propose a variational formulation of Doob's h-transform as an optimization problem over trajectories between a given initial point and the desired ending point. To solve this optimization, we propose a simulation-free training objective with a model parameterization that imposes the desired boundary conditions by design. Our approach significantly reduces the search space over trajectories and avoids expensive trajectory simulation and inefficient importance sampling estimators which are required in existing methods. We demonstrate the ability of our method to find feasible transition paths on real-world molecular simulation and protein folding tasks.
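The "boundary conditions by design" idea from the abstract can be illustrated with a minimal sketch (a hypothetical parameterization for illustration, not the paper's exact model): interpolate linearly between the endpoints and gate a learned residual by a factor that vanishes at both boundaries, so every parameter setting yields a path from x0 to x1.

```python
import numpy as np

def boundary_conditioned_path(x0, x1, t, residual):
    """Path that hits x0 at t=0 and x1 at t=1 by construction.

    The linear interpolation term pins the endpoints, and the
    t * (1 - t) factor zeroes the learned residual at both
    boundaries, so any parameters inside `residual` leave the
    endpoints fixed. (Hypothetical form illustrating 'boundary
    conditions by design'; the paper's parameterization may differ.)
    """
    t = np.asarray(t, dtype=float).reshape(-1, 1)
    return (1.0 - t) * x0 + t * x1 + t * (1.0 - t) * residual(t)

x0 = np.array([0.0, 0.0])
x1 = np.array([1.0, 2.0])
# A fixed function stands in for a trainable neural network.
path = boundary_conditioned_path(x0, x1, np.array([0.0, 0.5, 1.0]),
                                 lambda t: np.sin(3 * t))
```

Because the endpoint constraints hold identically, optimization only ever searches over the interior of the path, which is one way to read the claim that the approach "significantly reduces the search space over trajectories."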
Related papers
- Hyperparameter Trajectory Inference with Conditional Lagrangian Optimal Transport [51.56484100374058]
Post-deployment, user preferences can evolve, making initial settings undesirable. We learn, from observed data, how a NN's conditional output distribution changes with its hyperparameters. We construct a surrogate model that approximates the NN at unobserved hyperparameters.
arXiv Detail & Related papers (2026-03-02T11:55:02Z) - HardFlow: Hard-Constrained Sampling for Flow-Matching Models via Trajectory Optimization [4.249024052507976]
We introduce a novel framework that reformulates hard-constrained sampling as a trajectory optimization problem. Our key insight is to leverage numerical optimal control to steer the sampling trajectory so that constraints are satisfied precisely at the terminal time. Our algorithm, which we name HardFlow, substantially outperforms existing methods in both constraint satisfaction and sample quality.
arXiv Detail & Related papers (2025-11-11T16:33:57Z) - Simulation-Free Differential Dynamics through Neural Conservation Laws [22.4113724471297]
We present a novel simulation-free framework for training continuous-time diffusion processes over very general objective functions. We propose a coupled parameterization which jointly models a time-dependent density function, or probability path, and the dynamics of a diffusion process that generates this probability path.
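The coupling described in this summary is, in standard form, the Fokker–Planck (continuity) equation linking a density path $p_t$ to the drift $v_t$ of a diffusion $dx_t = v_t(x_t)\,dt + \sigma\,dW_t$ (a textbook identity, not a claim about this paper's exact parameterization):

```latex
\partial_t p_t(x) \;=\; -\,\nabla \cdot \big( p_t(x)\, v_t(x) \big) \;+\; \tfrac{\sigma^2}{2}\, \Delta p_t(x)
```

A parameterization in which this identity holds by construction lets one optimize objectives over $(p_t, v_t)$ jointly without simulating trajectories.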
arXiv Detail & Related papers (2025-06-23T13:04:23Z) - Simulating the Schwinger Model with a Regularized Variational Quantum Imaginary Time Evolution [9.615119990353087]
The Schwinger model serves as a benchmark for testing non-perturbative algorithms in quantum chromodynamics.
classical algorithms encounter challenges when simulating the Schwinger model, such as the "sign problem"
arXiv Detail & Related papers (2024-09-20T13:51:48Z) - Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
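As a toy illustration of residual-based, simulation-free training in the PINN spirit, the sketch below recovers an antiderivative of cos(x) by minimizing the ODE residual u'(x) − cos(x) at random collocation points. The polynomial ansatz is a hypothetical linear stand-in for a neural network, chosen so the least-squares step is closed-form; nothing here reproduces the cited paper's actual models.

```python
import numpy as np

# Ansatz u(x) = x * (c0 + c1*x + c2*x^2) enforces u(0) = 0 by
# construction, so only the differential-equation residual needs
# to be minimized at sampled collocation points.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, np.pi / 2, size=200)

# u'(x) = c0 + 2*c1*x + 3*c2*x^2 is linear in the coefficients,
# so the residual minimization reduces to ordinary least squares.
A = np.stack([np.ones_like(x), 2 * x, 3 * x**2], axis=1)
coef, *_ = np.linalg.lstsq(A, np.cos(x), rcond=None)

u = lambda x: x * (coef[0] + coef[1] * x + coef[2] * x**2)
```

No trajectory of the underlying system is ever simulated: training data are just residual evaluations at sampled points, which is what makes this family of methods discretization-free.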
arXiv Detail & Related papers (2024-07-10T17:39:50Z) - Diffusion Generative Flow Samplers: Improving learning signals through partial trajectory optimization [87.21285093582446]
Diffusion Generative Flow Samplers (DGFS) is a sampling-based framework where the learning process can be tractably broken down into short partial trajectory segments.
Our method takes inspiration from the theory developed for generative flow networks (GFlowNets)
arXiv Detail & Related papers (2023-10-04T09:39:05Z) - Surrogate Neural Networks for Efficient Simulation-based Trajectory Planning Optimization [28.292234483886947]
This paper presents a novel methodology that uses surrogate models in the form of neural networks to reduce the computation time of simulation-based optimization of a reference trajectory.
We find a 74% better-performing reference trajectory compared to nominal, and the numerical results clearly show a substantial reduction in computation time for designing future trajectories.
arXiv Detail & Related papers (2023-03-30T15:44:30Z) - Particle-Based Score Estimation for State Space Model Learning in Autonomous Driving [62.053071723903834]
Multi-object state estimation is a fundamental problem for robotic applications.
We consider learning maximum-likelihood parameters using particle methods.
We apply our method to real data collected from autonomous vehicles.
arXiv Detail & Related papers (2022-12-14T01:21:05Z) - Entropic Neural Optimal Transport via Diffusion Processes [105.34822201378763]
We propose a novel neural algorithm for the fundamental problem of computing the entropic optimal transport (EOT) plan between continuous probability distributions.
Our algorithm is based on the saddle point reformulation of the dynamic version of EOT which is known as the Schrödinger Bridge problem.
In contrast to the prior methods for large-scale EOT, our algorithm is end-to-end and consists of a single learning step.
arXiv Detail & Related papers (2022-11-02T14:35:13Z) - Neural Motion Fields: Encoding Grasp Trajectories as Implicit Value Functions [65.84090965167535]
We present Neural Motion Fields, a novel object representation which encodes both object point clouds and the relative task trajectories as an implicit value function parameterized by a neural network.
This object-centric representation models a continuous distribution over the SE(3) space and allows us to perform grasping reactively by leveraging sampling-based MPC to optimize this value function.
arXiv Detail & Related papers (2022-06-29T18:47:05Z) - Neural Lagrangian Schrödinger bridge [25.157282476221482]
We study population dynamics using continuous normalizing flows (CNFs) and dynamic optimal transport.
We formulate the Lagrangian Schrödinger bridge (LSB) problem and propose to solve it using a neural SDE with regularization.
Our experiments show that our solution to the LSB problem can approximate the dynamics at the population level.
arXiv Detail & Related papers (2022-04-11T03:32:17Z) - Stochastic Trajectory Prediction via Motion Indeterminacy Diffusion [88.45326906116165]
We present a new framework to formulate the trajectory prediction task as a reverse process of motion indeterminacy diffusion (MID).
We encode the history behavior information and the social interactions as a state embedding and devise a Transformer-based diffusion model to capture the temporal dependencies of trajectories.
Experiments on the human trajectory prediction benchmarks including the Stanford Drone and ETH/UCY datasets demonstrate the superiority of our method.
arXiv Detail & Related papers (2022-03-25T16:59:08Z) - Error-Correcting Neural Networks for Semi-Lagrangian Advection in the Level-Set Method [0.0]
We present a machine learning framework that blends image super-resolution technologies with scalar transport in the level-set method.
We investigate whether we can compute on-the-fly data-driven corrections to minimize numerical viscosity in the coarse-mesh evolution of an interface.
arXiv Detail & Related papers (2021-10-22T06:36:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.