Diffusion Model-Based Posterior Sampling in Full Waveform Inversion
- URL: http://arxiv.org/abs/2512.12797v1
- Date: Sun, 14 Dec 2025 18:34:12 GMT
- Title: Diffusion Model-Based Posterior Sampling in Full Waveform Inversion
- Authors: Mohammad H. Taufik, Tariq Alkhalifah
- Abstract summary: Posterior sampling directly on observed seismic shot records is rarely practical at the field scale. Our approach couples diffusion-based posterior sampling with simultaneous-source waveform inversion data. Our method achieves lower model error and better data fit at a substantially reduced computational cost.
- Score: 3.2800968305157205
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian full waveform inversion (FWI) offers uncertainty-aware subsurface models; however, posterior sampling directly on observed seismic shot records is rarely practical at the field scale because each sample requires numerous wave-equation solves. We aim to make such sampling feasible for large surveys while preserving calibration, that is, high uncertainty in less illuminated areas. Our approach couples diffusion-based posterior sampling with simultaneous-source FWI data. At each diffusion noise level, a network predicts a clean velocity model. We then apply a stochastic refinement step in model space using Langevin dynamics under the wave-equation likelihood and reintroduce noise to decouple successive levels before proceeding. Simultaneous-source batches reduce forward and adjoint solves approximately in proportion to the supergather size, while an unconditional diffusion prior trained on velocity patches and volumes helps suppress source-related numerical artefacts. We evaluate the method on three 2D synthetic datasets (SEG/EAGE Overthrust, SEG/EAGE Salt, SEAM Arid), a 2D field line, and a 3D upscaling study. Relative to a particle-based variational baseline, namely Stein variational gradient descent without a learned prior and with single-source (non-simultaneous-source) FWI, our sampler achieves lower model error and better data fit at a substantially reduced computational cost. By aligning encoded-shot likelihoods with diffusion-based sampling and exploiting straightforward parallelization over samples and source batches, the method provides a practical path to calibrated posterior inference on observed shot records that scales to large 2D and 3D problems.
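For readers who want a concrete picture of the per-noise-level loop the abstract describes, here is a minimal sketch of one step of such a sampler. All helper names (`denoiser`, `misfit_grad`, the random-sign `encode_sources`), the unadjusted-Langevin form of the refinement, and every step size are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def encode_sources(shots, idx, rng):
    """Random +/-1 source encoding forming one supergather (an assumed
    encoding; the abstract only says simultaneous-source batches are used)."""
    signs = rng.choice([-1.0, 1.0], size=len(idx))
    return sum(s * shots[i] for s, i in zip(signs, idx))

def posterior_sampling_step(v_t, sigma_t, sigma_next, denoiser, misfit_grad,
                            shots, rng, n_langevin=5, step=1e-3, batch_size=8):
    """One noise level: predict a clean velocity model, refine it with
    Langevin dynamics under the wave-equation likelihood, then reintroduce
    noise to decouple successive levels."""
    # 1) The diffusion network predicts a clean velocity model.
    v = denoiser(v_t, sigma_t)

    # 2) Stochastic refinement in model space (unadjusted Langevin):
    #    v <- v - (step/2) * grad E(v) + sqrt(step) * xi,  xi ~ N(0, I),
    #    where E is the encoded-shot data misfit and misfit_grad wraps the
    #    wave-equation forward and adjoint solves.
    for _ in range(n_langevin):
        batch = rng.choice(len(shots), size=batch_size, replace=False)
        supergather = encode_sources(shots, batch, rng)
        g = misfit_grad(v, supergather)
        v = v - 0.5 * step * g + np.sqrt(step) * rng.standard_normal(v.shape)

    # 3) Reintroduce noise before moving on to the next (lower) noise level.
    return v + sigma_next * rng.standard_normal(v.shape)
```

Because the misfit is evaluated on one encoded supergather rather than on every shot, the number of forward and adjoint solves per Langevin step drops roughly in proportion to the supergather size, which is where the abstract's cost saving comes from; independent samples and source batches can also run in parallel.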
Related papers
- Manifold-Aligned Generative Transport [11.857867207010981]
We propose a flow-like generator that learns a one-shot, manifold-aligned transport from a low-dimensional base distribution to the data space. We empirically improve fidelity and manifold concentration across synthetic and benchmark datasets while sampling substantially faster than diffusion models.
arXiv Detail & Related papers (2026-02-23T08:42:40Z)
- Rethinking Refinement: Correcting Generative Bias without Noise Injection [7.28668585578288]
Generative models, including diffusion and flow-based models, often exhibit systematic biases that degrade sample quality. We show that effective bias correction can be achieved as a post-hoc procedure, without noise injection or multi-step resampling.
arXiv Detail & Related papers (2026-01-29T02:34:08Z)
- Matching the Optimal Denoiser in Point Cloud Diffusion with (Improved) Rotational Alignment [5.8069334875117775]
Training a diffusion model consists of learning how to denoise noisy samples at different noise levels. We show that the optimal denoiser can be expressed in terms of a matrix Fisher distribution over $SO(3)$. We build on this perspective to derive better approximators to the optimal denoiser in the limit of small noise.
arXiv Detail & Related papers (2025-10-02T05:55:22Z)
- Score Distillation of Flow Matching Models [67.86066177182046]
We extend Score identity Distillation (SiD) to pretrained text-to-image flow-matching models. SiD works out of the box across these models, in both data-free and data-aided settings. This provides the first systematic evidence that score distillation applies broadly to text-to-image flow-matching models.
arXiv Detail & Related papers (2025-09-29T17:45:48Z)
- Guided Diffusion Sampling on Function Spaces with Applications to PDEs [112.09025802445329]
We propose a general framework for conditional sampling in PDE-based inverse problems. This is accomplished by a function-space diffusion model and plug-and-play guidance for conditioning. Our method achieves an average 32% accuracy improvement over state-of-the-art fixed-resolution diffusion baselines.
arXiv Detail & Related papers (2025-05-22T17:58:12Z)
- Enhancing Diffusion Posterior Sampling for Inverse Problems by Integrating Crafted Measurements [46.03835001280626]
Current posterior sampling-based methods incorporate the measurement into the posterior sampling process to infer the distribution of the target data. We show that high-frequency information can be prematurely introduced during the early stages, which can induce larger posterior-estimation errors. We propose a novel diffusion posterior sampling method, DPS-CM, which incorporates a Crafted Measurement (i.e., a noisy measurement crafted by a reverse denoising process) to form the posterior estimate.
arXiv Detail & Related papers (2024-11-15T00:06:57Z)
- Diffusion Prior-Based Amortized Variational Inference for Noisy Inverse Problems [12.482127049881026]
We propose a novel approach to solve inverse problems with a diffusion prior from an amortized variational inference perspective.
Our amortized inference learns a function that directly maps measurements to the implicit posterior distributions of corresponding clean data, enabling a single-step posterior sampling even for unseen measurements.
arXiv Detail & Related papers (2024-07-23T02:14:18Z)
- Boosting Diffusion Models with Moving Average Sampling in Frequency Domain [101.43824674873508]
Diffusion models rely on the current sample to denoise the next one, possibly resulting in denoising instability.
In this paper, we reinterpret the iterative denoising process as model optimization and leverage a moving average mechanism to ensemble all the prior samples.
We name the complete approach "Moving Average Sampling in Frequency domain (MASF)"; a minimal sketch of the idea appears after this list.
arXiv Detail & Related papers (2024-03-26T16:57:55Z)
- Ambient Diffusion Posterior Sampling: Solving Inverse Problems with Diffusion Models Trained on Corrupted Data [54.09959775518994]
We provide a framework for solving inverse problems with diffusion models learned from linearly corrupted data. We train diffusion models for MRI with access only to subsampled multi-coil measurements at acceleration factors R = 2, 4, 6, 8. For MRI reconstruction in high-acceleration regimes, we observe that A-DPS models trained on subsampled data are better suited to solving inverse problems than models trained on fully sampled data.
arXiv Detail & Related papers (2024-03-13T17:28:20Z)
- Consistent3D: Towards Consistent High-Fidelity Text-to-3D Generation with Deterministic Sampling Prior [87.55592645191122]
Score distillation sampling (SDS) and its variants have greatly boosted the development of text-to-3D generation, but remain vulnerable to geometry collapse and poor textures.
We propose a novel and effective "Consistent3D" method that exploits the deterministic ODE sampling prior for text-to-3D generation.
Experimental results show the efficacy of our Consistent3D in generating high-fidelity and diverse 3D objects and large-scale scenes.
arXiv Detail & Related papers (2024-01-17T08:32:07Z)
- Adaptive Multi-step Refinement Network for Robust Point Cloud Registration [82.64560249066734]
Point cloud registration estimates the relative rigid transformation between two point clouds of the same scene. We propose an adaptive multi-step refinement network that refines the registration quality at each step by leveraging the information from the preceding step. Our method achieves state-of-the-art performance on both the 3DMatch/3DLoMatch and KITTI benchmarks.
arXiv Detail & Related papers (2023-12-05T18:59:41Z) - Decomposed Diffusion Sampler for Accelerating Large-Scale Inverse
Problems [64.29491112653905]
We propose a novel and efficient diffusion sampling strategy that synergistically combines diffusion sampling and Krylov subspace methods.
Specifically, we prove that if the tangent space at a sample denoised by Tweedie's formula forms a Krylov subspace, then conjugate gradient (CG) iterations initialized with the denoised data keep the data-consistency update within that tangent space (a sketch of this Tweedie-plus-CG step appears after this list).
Our proposed method achieves more than 80 times faster inference than the previous state-of-the-art method.
arXiv Detail & Related papers (2023-03-10T07:42:49Z)
- Diffusion Model Based Posterior Sampling for Noisy Linear Inverse Problems [14.809545109705256]
This paper presents a fast and effective solution by proposing a simple closed-form approximation to the likelihood score.
For both diffusion and flow-based models, extensive experiments are conducted on various noisy linear inverse problems.
Our method demonstrates highly competitive or even better reconstruction performance while being significantly faster than all the baseline methods.
arXiv Detail & Related papers (2022-11-20T01:09:49Z)
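Two of the entries above name mechanisms concrete enough to sketch. First, for "Moving Average Sampling in Frequency domain (MASF)": the snippet below shows a uniform-decay exponential moving average over the sequence of denoised (x0) predictions, applied in the DFT domain. This only illustrates the idea named in the entry; MASF's actual per-frequency weighting is more elaborate than the single `decay` constant assumed here.

```python
import numpy as np

class FrequencyMovingAverage:
    """Ensemble successive denoised predictions by averaging their spectra.
    The uniform decay constant is an illustrative assumption."""
    def __init__(self, decay=0.9):
        self.decay = decay
        self.ema = None  # running average in the DFT domain

    def update(self, x0_pred):
        """Blend the newest denoised prediction into the running average
        and return the ensembled estimate in the spatial domain."""
        spec = np.fft.fft2(x0_pred)
        if self.ema is None:
            self.ema = spec
        else:
            self.ema = self.decay * self.ema + (1.0 - self.decay) * spec
        return np.real(np.fft.ifft2(self.ema))
```

In use, each raw x0 prediction inside the sampling loop would be replaced by the output of `update(x0_pred)` before the next denoising step, damping the step-to-step instability the entry mentions.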
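Second, for the Decomposed Diffusion Sampler entry: the sketch below shows the two ingredients it names, Tweedie's formula for denoising and a short conjugate-gradient (CG) data-consistency solve on the normal equations. The variance-exploding parameterization, the dense-matrix form of `A`, and the iteration count are all assumptions for illustration.

```python
import numpy as np

def tweedie_denoise(x, sigma, score):
    """Tweedie's formula for a variance-exploding diffusion:
    E[x0 | x] = x + sigma^2 * score(x, sigma)."""
    return x + sigma**2 * score(x, sigma)

def cg_data_consistency(A, y, x0_hat, n_iter=5):
    """A few CG steps on the normal equations A^T A z = A^T y, initialized
    at the Tweedie-denoised sample so the data-consistency update stays
    near the denoised estimate (the entry's key claim). A is a 2-D array
    and signals are flattened to 1-D vectors for simplicity."""
    z = x0_hat.copy()
    r = A.T @ (y - A @ z)          # residual of the normal equations
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        Ap = A.T @ (A @ p)
        alpha = rs / (p @ Ap)
        z = z + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return z
```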