Semi-supervised Invertible DeepONets for Bayesian Inverse Problems
- URL: http://arxiv.org/abs/2209.02772v2
- Date: Thu, 8 Sep 2022 08:00:44 GMT
- Title: Semi-supervised Invertible DeepONets for Bayesian Inverse Problems
- Authors: Sebastian Kaltenbach, Paris Perdikaris, Phaedon-Stelios Koutsourelakis
- Abstract summary: DeepONets offer a powerful, data-driven tool for solving parametric PDEs by learning operators.
In this work, we employ physics-informed DeepONets in the context of high-dimensional, Bayesian inverse problems.
- Score: 8.594140167290098
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep Operator Networks (DeepONets) offer a powerful, data-driven tool for
solving parametric PDEs by learning operators, i.e. maps between
infinite-dimensional function spaces. In this work, we employ physics-informed
DeepONets in the context of high-dimensional, Bayesian inverse problems.
Traditional solution strategies necessitate an enormous, and frequently
infeasible, number of forward model solves, as well as the computation of
parametric derivatives. In order to enable efficient solutions, we extend
DeepONets by employing a realNVP architecture which yields an invertible and
differentiable map between the parametric input and the branch net output. This
allows us to construct accurate approximations of the full posterior which can
be readily adapted irrespective of the number of observations and the magnitude
of the observation noise. As a result, no additional forward solves are
required, nor is there any need for costly sampling procedures. We demonstrate
the efficacy and accuracy of the proposed methodology in the context of inverse
problems based on an anti-derivative, a reaction-diffusion and a Darcy-flow
equation.
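As a rough illustration of the construction described in the abstract, the sketch below (not the authors' code) combines a small stack of realNVP-style affine coupling layers with a DeepONet branch/trunk split, and shows how the invertible map lets one evaluate an unnormalised log-posterior over the parametric input by a change of variables, with no further forward solves. The layer sizes, the standard-normal prior placed on the branch coefficients, and the Gaussian likelihood are all assumptions made for this example.

```python
# Illustrative sketch only: a DeepONet whose branch output is tied to the
# parametric input through a realNVP-style invertible map, so the posterior
# over the parameters can be evaluated by a change of variables rather than
# by additional forward solves. Layer sizes, the standard-normal prior on the
# branch coefficients, and the Gaussian likelihood are assumptions.
import torch
import torch.nn as nn


class AffineCoupling(nn.Module):
    """One realNVP affine coupling layer: invertible, with a cheap log-det."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        self.scale = nn.Sequential(nn.Linear(self.half, hidden), nn.Tanh(),
                                   nn.Linear(hidden, dim - self.half))
        self.shift = nn.Sequential(nn.Linear(self.half, hidden), nn.Tanh(),
                                   nn.Linear(hidden, dim - self.half))

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s = self.scale(x1)
        y2 = x2 * torch.exp(s) + self.shift(x1)
        return torch.cat([x1, y2], dim=1), s.sum(dim=1)  # output, log|det J|

    def inverse(self, y):
        y1, y2 = y[:, :self.half], y[:, self.half:]
        x2 = (y2 - self.shift(y1)) * torch.exp(-self.scale(y1))
        return torch.cat([y1, x2], dim=1)


class InvertibleDeepONet(nn.Module):
    """DeepONet whose branch net is a realNVP flow over the parametric input."""

    def __init__(self, param_dim=32, n_couplings=4):
        super().__init__()
        self.flow = nn.ModuleList([AffineCoupling(param_dim) for _ in range(n_couplings)])
        # Trunk net: evaluates basis functions at 1D query locations y.
        self.trunk = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, param_dim))

    def branch(self, theta):
        """Parameters theta -> branch coefficients b, accumulating log|det J|."""
        b, log_det = theta, torch.zeros(theta.shape[0])
        for layer in self.flow:
            b, ld = layer(b)
            log_det = log_det + ld
        return b, log_det

    def inverse_branch(self, b):
        """Exact inverse: branch coefficients b -> parameters theta."""
        for layer in reversed(self.flow):
            b = layer.inverse(b)
        return b

    def forward(self, theta, y):
        b, _ = self.branch(theta)        # (batch, p)
        return b @ self.trunk(y).T       # predicted solution at query points y

    def log_posterior(self, theta, y_obs, u_obs, noise_std=0.1):
        """Unnormalised log-posterior of theta given noisy observations u_obs."""
        b, log_det = self.branch(theta)
        u_pred = b @ self.trunk(y_obs).T
        log_lik = -0.5 * ((u_pred - u_obs) / noise_std).pow(2).sum(dim=1)
        log_prior = -0.5 * b.pow(2).sum(dim=1) + log_det  # change of variables
        return log_lik + log_prior


if __name__ == "__main__":
    model = InvertibleDeepONet()
    theta = torch.randn(8, 32)                          # candidate parameter vectors
    y_obs = torch.linspace(0.0, 1.0, 20).unsqueeze(1)   # observation locations
    u_obs = torch.sin(torch.pi * y_obs).T               # synthetic observations
    print(model.log_posterior(theta, y_obs, u_obs).shape)             # torch.Size([8])
    b, _ = model.branch(theta)
    print(torch.allclose(model.inverse_branch(b), theta, atol=1e-4))  # invertibility
```

The round-trip check at the end illustrates the invertibility that makes the change-of-variables evaluation possible; in practice the operator network would first be trained, e.g. with physics-informed losses as mentioned in the abstract, before the posterior is evaluated.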
Related papers
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z)
- Solving Poisson Equations using Neural Walk-on-Spheres [80.1675792181381]
We propose Neural Walk-on-Spheres (NWoS), a novel neural PDE solver for the efficient solution of high-dimensional Poisson equations.
We demonstrate the superiority of NWoS in accuracy, speed, and computational costs.
arXiv Detail & Related papers (2024-06-05T17:59:22Z)
- ODE-DPS: ODE-based Diffusion Posterior Sampling for Inverse Problems in Partial Differential Equation [1.8356973269166506]
We introduce a novel unsupervised inversion methodology tailored for solving inverse problems arising from PDEs.
Our approach operates within the Bayesian inversion framework, treating the task of solving the posterior distribution as a conditional generation process.
To enhance the accuracy of inversion results, we propose an ODE-based Diffusion inversion algorithm.
arXiv Detail & Related papers (2024-04-21T00:57:13Z)
- Reverse em-problem based on Bregman divergence and its application to classical and quantum information theory [53.64687146666141]
A recent paper introduced an analytical method for calculating the channel capacity without the need for iteration.
We turn our attention to the reverse em-problem, proposed by Toyota.
We derive a non-iterative formula for the reverse em-problem.
arXiv Detail & Related papers (2024-03-14T10:20:28Z)
- Flow-based Distributionally Robust Optimization [23.232731771848883]
We present a framework, called $\texttt{FlowDRO}$, for solving flow-based distributionally robust optimization (DRO) problems with Wasserstein uncertainty sets.
We aim to find the continuous worst-case distribution (also called the Least Favorable Distribution, LFD) and sample from it.
We demonstrate its usage in adversarial learning, distributionally robust hypothesis testing, and a new mechanism for data-driven distribution perturbation differential privacy.
arXiv Detail & Related papers (2023-10-30T03:53:31Z)
- Multi-Grid Tensorized Fourier Neural Operator for High-Resolution PDEs [93.82811501035569]
We introduce a new data-efficient and highly parallelizable operator learning approach with reduced memory requirements and better generalization.
MG-TFNO scales to large resolutions by leveraging local and global structures of full-scale, real-world phenomena.
We demonstrate superior performance on the turbulent Navier-Stokes equations where we achieve less than half the error with over 150x compression.
arXiv Detail & Related papers (2023-09-29T20:18:52Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- Neural Inverse Operators for Solving PDE Inverse Problems [5.735035463793008]
We propose a novel architecture termed Neural Inverse Operators (NIOs) to solve these PDE inverse problems.
A variety of experiments are presented to demonstrate that NIOs significantly outperform baselines, solve PDE inverse problems robustly and accurately, and are several orders of magnitude faster than existing direct and PDE-constrained optimization methods.
arXiv Detail & Related papers (2023-01-26T15:12:58Z)
- Accelerated replica exchange stochastic gradient Langevin diffusion enhanced Bayesian DeepONet for solving noisy parametric PDEs [7.337247167823921]
We propose a training framework for replica-exchange Langevin diffusion that exploits the neural network architecture of DeepONets.
We show that the proposed framework's exploration and exploitation capabilities enable improved training convergence for DeepONets in noisy scenarios.
We also show that replica-exchange Langevin diffusion improves the DeepONet's mean prediction accuracy in noisy scenarios.
arXiv Detail & Related papers (2021-11-03T19:23:59Z)
- A Deep Learning approach to Reduced Order Modelling of Parameter Dependent Partial Differential Equations [0.2148535041822524]
We develop a constructive approach based on Deep Neural Networks for the efficient approximation of the parameter-to-solution map.
In particular, we consider parametrized advection-diffusion PDEs, and we test the methodology in the presence of strong transport fields.
arXiv Detail & Related papers (2021-03-10T17:01:42Z)
- Solving Sparse Linear Inverse Problems in Communication Systems: A Deep Learning Approach With Adaptive Depth [51.40441097625201]
We propose an end-to-end trainable deep learning architecture for sparse signal recovery problems.
The proposed method learns how many layers to execute to emit an output, and the network depth is dynamically adjusted for each task in the inference phase.
arXiv Detail & Related papers (2020-10-29T06:32:53Z)