Learning by example: fast reliability-aware seismic imaging with
normalizing flows
- URL: http://arxiv.org/abs/2104.06255v1
- Date: Tue, 13 Apr 2021 15:13:45 GMT
- Authors: Ali Siahkoohi and Felix J. Herrmann
- Abstract summary: We train a normalizing flow (NF) capable of cheaply sampling the posterior distribution given previously unseen seismic data from neighboring surveys.
We use these samples to compute a high-fidelity image including a first assessment of the image's reliability.
- Score: 0.76146285961466
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Uncertainty quantification provides quantitative measures on the reliability
of candidate solutions of ill-posed inverse problems. Due to their sequential
nature, Monte Carlo sampling methods require large numbers of sampling steps
for accurate Bayesian inference and are often computationally infeasible for
large-scale inverse problems, such as seismic imaging. Our main contribution is
a data-driven variational inference approach where we train a normalizing flow
(NF), a type of invertible neural net, capable of cheaply sampling the
posterior distribution given previously unseen seismic data from neighboring
surveys. To arrive at this result, we train the NF on pairs of low- and
high-fidelity migrated images. In our numerical example, we obtain
high-fidelity images from the Parihaka dataset and low-fidelity images are
derived from these images through the process of demigration, followed by
adding noise and migration. During inference, given shot records from a new
neighboring seismic survey, we first compute the reverse-time migration image.
Next, by feeding this low-fidelity migrated image to the NF we gain access to
samples from the posterior distribution virtually for free. We use these
samples to compute a high-fidelity image including a first assessment of the
image's reliability. To our knowledge, this is the first attempt to train a
conditional network on what we know from neighboring images to improve the
current image and assess its reliability.
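The inference workflow described above (feed the low-fidelity reverse-time migration image to a trained conditional normalizing flow, draw cheap posterior samples, and summarize them into an image estimate plus a reliability map) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the single conditional affine layer, the linear conditioner weights `W_shift`/`W_scale`, and the toy dimensions are assumptions standing in for a deep invertible network trained on image pairs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: in practice these would be full migrated images.
d = 16           # number of image pixels (flattened)
n_samples = 512  # posterior samples to draw

# Hypothetical "trained" conditioner: maps the low-fidelity image to the
# shift and log-scale of one conditional affine (coupling-style) layer.
# In the paper this role is played by a trained invertible neural network.
W_shift = rng.normal(scale=0.1, size=(d, d))
W_scale = rng.normal(scale=0.01, size=(d, d))

def sample_posterior(low_fidelity_image, n):
    """Draw n cheap posterior samples conditioned on a low-fidelity image."""
    shift = W_shift @ low_fidelity_image
    log_scale = W_scale @ low_fidelity_image
    z = rng.normal(size=(n, d))              # latent Gaussian samples
    # Conditional affine transform: x = z * exp(log_scale) + shift.
    return z * np.exp(log_scale) + shift

# Stand-in for the reverse-time migration image of new shot records.
rtm_image = rng.normal(size=d)

samples = sample_posterior(rtm_image, n_samples)
high_fidelity_estimate = samples.mean(axis=0)   # conditional mean image
pointwise_uncertainty = samples.std(axis=0)     # first reliability measure
```

Once the flow is trained, each posterior sample costs only one forward pass through the network, which is what makes the sample mean (the high-fidelity image) and the pointwise standard deviation (its reliability assessment) cheap to compute.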
Related papers
- Enhancing Diffusion Posterior Sampling for Inverse Problems by Integrating Crafted Measurements [45.70011319850862]
Diffusion models have emerged as a powerful foundation model for visual generation.
Current posterior-sampling-based methods incorporate the measurement into the posterior sampling process to infer the distribution of the target data.
We show that high-frequency information can be prematurely introduced during the early stages, which could induce larger posterior estimate errors.
We propose a novel diffusion posterior sampling method DPS-CM, which incorporates a Crafted Measurement.
arXiv Detail & Related papers (2024-11-15T00:06:57Z)
- Ambient Diffusion Posterior Sampling: Solving Inverse Problems with Diffusion Models trained on Corrupted Data [56.81246107125692]
Ambient Diffusion Posterior Sampling (A-DPS) is a generative model pre-trained on one type of corruption.
We show that A-DPS can sometimes outperform models trained on clean data for several image restoration tasks in both speed and performance.
We extend the Ambient Diffusion framework to train MRI models with access only to Fourier subsampled multi-coil MRI measurements.
arXiv Detail & Related papers (2024-03-13T17:28:20Z)
- Deep Equilibrium Diffusion Restoration with Parallel Sampling [120.15039525209106]
Diffusion model-based image restoration (IR) aims to use diffusion models to recover high-quality (HQ) images from degraded images, achieving promising performance.
Most existing methods need long serial sampling chains to restore HQ images step-by-step, resulting in expensive sampling time and high computation costs.
In this work, we aim to rethink the diffusion model-based IR models through a different perspective, i.e., a deep equilibrium (DEQ) fixed point system, called DeqIR.
arXiv Detail & Related papers (2023-11-20T08:27:56Z)
- Masked Images Are Counterfactual Samples for Robust Fine-tuning [77.82348472169335]
Fine-tuning deep learning models can lead to a trade-off between in-distribution (ID) performance and out-of-distribution (OOD) robustness.
We propose a novel fine-tuning method, which uses masked images as counterfactual samples that help improve the robustness of the fine-tuning model.
arXiv Detail & Related papers (2023-03-06T11:51:28Z)
- Posterior samples of source galaxies in strong gravitational lenses with score-based priors [107.52670032376555]
We use a score-based model to encode the prior for the inference of undistorted images of background galaxies.
We show how the balance between the likelihood and the prior meets our expectations in an experiment with out-of-distribution data.
arXiv Detail & Related papers (2022-11-07T19:00:42Z)
- Conditional Variational Autoencoder for Learned Image Reconstruction [5.487951901731039]
We develop a novel framework that approximates the posterior distribution of the unknown image at each query observation.
It handles implicit noise models and priors, it incorporates the data formation process (i.e., the forward operator), and the learned reconstructive properties are transferable between different datasets.
arXiv Detail & Related papers (2021-10-22T10:02:48Z)
- Score-based diffusion models for accelerated MRI [35.3148116010546]
We introduce a way to sample data from a conditional distribution given the measurements, such that the model can be readily used for solving inverse problems in imaging.
Our model requires magnitude images only for training, and yet is able to reconstruct complex-valued data, and even extends to parallel imaging.
arXiv Detail & Related papers (2021-10-08T08:42:03Z)
- Learning Energy-Based Models by Diffusion Recovery Likelihood [61.069760183331745]
We present a diffusion recovery likelihood method to tractably learn and sample from a sequence of energy-based models.
After training, synthesized images can be generated by a sampling process initialized from a Gaussian white-noise distribution.
On unconditional CIFAR-10 our method achieves FID 9.58 and inception score 8.30, superior to the majority of GANs.
arXiv Detail & Related papers (2020-12-15T07:09:02Z)
- Salvage Reusable Samples from Noisy Data for Robust Learning [70.48919625304]
We propose a reusable sample selection and correction approach, termed CRSSC, for coping with label noise when training deep fine-grained (FG) models with web images.
Our key idea is to additionally identify and correct reusable samples, and then leverage them together with clean examples to update the networks.
arXiv Detail & Related papers (2020-08-06T02:07:21Z)
- Uncertainty quantification in imaging and automatic horizon tracking: a Bayesian deep-prior based approach [0.5156484100374059]
Uncertainty quantification (UQ) deals with a probabilistic description of the solution nonuniqueness and data noise sensitivity.
In this paper, we focus on how UQ trickles down to horizon tracking for the determination of stratigraphic models.
arXiv Detail & Related papers (2020-04-01T04:26:33Z)
- A deep-learning based Bayesian approach to seismic imaging and uncertainty quantification [0.4588028371034407]
Uncertainty is essential when dealing with ill-conditioned inverse problems.
It is often not possible to formulate a prior distribution that precisely encodes our prior knowledge about the unknown.
We propose to use the functional form of a randomly initialized convolutional neural network as an implicit structured prior.
arXiv Detail & Related papers (2020-01-13T23:46:18Z)
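The main abstract contrasts sequential Monte Carlo sampling, which needs many sampling steps, with amortized sampling from a trained network, and several related papers above likewise perform score-based posterior sampling. As a generic, hedged illustration of the sequential approach (not any listed paper's method), here is unadjusted Langevin dynamics for a toy linear inverse problem with a standard Gaussian prior; the matrix `A`, noise level `sigma`, step size `eta`, and chain length are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear inverse problem: y = A @ x + noise, standard Gaussian prior on x.
d, m = 8, 4
A = rng.normal(size=(m, d))
x_true = rng.normal(size=d)
sigma = 0.1
y = A @ x_true + sigma * rng.normal(size=m)

def posterior_score(x):
    """Score of the posterior: prior score (-x) plus likelihood score."""
    return -x + A.T @ (y - A @ x) / sigma**2

# Unadjusted Langevin dynamics: one long sequential chain, which is what
# makes MCMC-style posterior sampling costly at scale compared with
# drawing samples from a trained conditional generative model.
eta = 1e-4
x = np.zeros(d)
samples = []
for k in range(20000):
    x = x + 0.5 * eta * posterior_score(x) + np.sqrt(eta) * rng.normal(size=d)
    if k > 10000:                 # discard burn-in, keep the tail
        samples.append(x.copy())
samples = np.array(samples)
posterior_mean = samples.mean(axis=0)
```

Even for this eight-dimensional toy problem the chain needs thousands of sequential steps to mix; for image-sized unknowns with expensive forward operators, as in seismic imaging, this cost is what motivates the amortized, NF-based approach of the main paper.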
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.