Wasserstein-based Projections with Applications to Inverse Problems
- URL: http://arxiv.org/abs/2008.02200v3
- Date: Wed, 14 Apr 2021 16:25:43 GMT
- Title: Wasserstein-based Projections with Applications to Inverse Problems
- Authors: Howard Heaton, Samy Wu Fung, Alex Tong Lin, Stanley Osher, Wotao Yin
- Abstract summary: Inverse problems consist of recovering a signal from a collection of noisy measurements.
Recent Plug-and-Play works propose replacing an operator for analytic regularization in optimization methods by a data-driven denoiser.
We present a new algorithm that takes samples from the manifold of true data as input and outputs an approximation of the projection operator onto this manifold.
- Score: 25.517805029337254
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Inverse problems consist of recovering a signal from a collection of noisy
measurements. These are typically cast as optimization problems, with classic
approaches using a data fidelity term and an analytic regularizer that
stabilizes recovery. Recent Plug-and-Play (PnP) works propose replacing the
operator for analytic regularization in optimization methods by a data-driven
denoiser. These schemes obtain state of the art results, but at the cost of
limited theoretical guarantees. To bridge this gap, we present a new algorithm
that takes samples from the manifold of true data as input and outputs an
approximation of the projection operator onto this manifold. Under standard
assumptions, we prove this algorithm generates a learned operator, called
Wasserstein-based projection (WP), that approximates the true projection with
high probability. Thus, WPs can be inserted into optimization methods in the
same manner as PnP, but now with theoretical guarantees. Provided numerical
examples show WPs obtain state of the art results for unsupervised PnP signal
recovery.
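The abstract's key idea is that a learned projection can occupy the same slot a denoiser fills in PnP schemes. A minimal sketch of projected gradient descent with a pluggable projection operator (the least-squares fidelity, step size, and the nonnegativity "manifold" below are illustrative assumptions, not the paper's learned WP):

```python
import numpy as np

def projected_gradient(A, b, project, x0, step=0.1, iters=100):
    """Minimize 0.5*||Ax - b||^2 over a constraint set by alternating a
    gradient step on the fidelity term with a projection onto the set --
    the slot a learned Wasserstein-based projection would fill."""
    x = x0.copy()
    for _ in range(iters):
        grad = A.T @ (A @ x - b)      # gradient of 0.5*||Ax - b||^2
        x = project(x - step * grad)  # learned projection replaces analytic prox
    return x

# Toy example: "manifold" = nonnegative vectors, projection = clipping.
A = np.array([[1.0, 0.5], [0.2, 1.0]])
b = np.array([1.0, 0.5])
x = projected_gradient(A, b, project=lambda z: np.maximum(z, 0.0),
                       x0=np.zeros(2))
```

In the paper's setting, `project` would be the WP operator trained from samples of the true-data manifold; here a simple clip stands in so the loop is runnable.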
Related papers
- Efficient Learning of POMDPs with Known Observation Model in Average-Reward Setting [56.92178753201331]
We propose the Observation-Aware Spectral (OAS) estimation technique, which enables the POMDP parameters to be learned from samples collected using a belief-based policy.
We show the consistency of the OAS procedure, and we prove a regret guarantee of order $\mathcal{O}(\sqrt{T \log(T)})$ for the proposed OAS-UCRL algorithm.
arXiv Detail & Related papers (2024-10-02T08:46:34Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
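The ADMM splitting behind this sampler can be illustrated on the linear-regression task the summary mentions, in its deterministic optimization form. A minimal consensus-ADMM sketch (worker count, penalty `rho`, and the noiseless data are illustrative assumptions; the paper's actual contribution is a sampling scheme, not this solver):

```python
import numpy as np

def consensus_admm(As, bs, rho=1.0, iters=100):
    """Consensus ADMM for distributed least squares: each worker i holds
    (A_i, b_i); local ridge solves alternate with averaging (z-update)
    and dual ascent (u-update) -- the splitting the sampler builds on."""
    n = As[0].shape[1]
    z = np.zeros(n)
    us = [np.zeros(n) for _ in As]
    for _ in range(iters):
        # Local x-updates: argmin 0.5*||A_i x - b_i||^2 + rho/2*||x - z + u_i||^2
        xs = [np.linalg.solve(A.T @ A + rho * np.eye(n),
                              A.T @ b + rho * (z - u))
              for (A, b), u in zip(zip(As, bs), us)]
        z = np.mean([x + u for x, u in zip(xs, us)], axis=0)   # consensus
        us = [u + x - z for x, u in zip(xs, us)]               # dual update
    return z

# Two workers sharing one ground-truth signal, noiseless measurements.
rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0])
As = [rng.standard_normal((5, 2)) for _ in range(2)]
bs = [A @ x_true for A in As]
z = consensus_admm(As, bs)
```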
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - Learning Unnormalized Statistical Models via Compositional Optimization [73.30514599338407]
Noise-contrastive estimation (NCE) has been proposed by formulating the objective as the logistic loss between real data and artificial noise.
In this paper, we study a direct approach for optimizing the negative log-likelihood of unnormalized models.
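The NCE objective the summary refers to is a logistic loss that discriminates data samples from noise samples using the log-ratio of the two densities. A minimal one-dimensional sketch (the Gaussian model and noise distributions are illustrative assumptions):

```python
import numpy as np

def nce_loss(log_model, log_noise, data, noise):
    """Logistic loss distinguishing data (label 1) from noise (label 0).
    log_model: unnormalized log-density of the model (log Z folded in);
    log_noise: known log-density of the noise distribution."""
    # Log-odds that a sample came from the model rather than the noise.
    logit_data = log_model(data) - log_noise(data)
    logit_noise = log_model(noise) - log_noise(noise)
    # Cross-entropy of the implied classifier against the true labels.
    loss_data = np.log1p(np.exp(-logit_data)).mean()
    loss_noise = np.log1p(np.exp(logit_noise)).mean()
    return loss_data + loss_noise

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=1000)   # pretend data ~ N(0, 1)
noise = rng.normal(0.0, 2.0, size=1000)  # noise ~ N(0, 4)
log_noise = lambda x: -0.5 * (x / 2.0) ** 2 - np.log(2.0 * np.sqrt(2 * np.pi))
log_model = lambda x: -0.5 * x ** 2 - np.log(np.sqrt(2 * np.pi))  # true density
loss = nce_loss(log_model, log_noise, data, noise)
```

Minimizing this loss over a parametric `log_model` recovers the unnormalized density without ever computing the partition function, which is the point of NCE.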
arXiv Detail & Related papers (2023-06-13T01:18:16Z) - Provably Efficient Bayesian Optimization with Unknown Gaussian Process Hyperparameter Estimation [44.53678257757108]
We propose a new BO method that can sub-linearly converge to the objective function's global optimum.
Our method uses a multi-armed bandit technique (EXP3) to add random data points to the BO process.
We demonstrate empirically that our method outperforms existing approaches on various synthetic and real-world problems.
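The EXP3 subroutine named above is a classic adversarial multi-armed bandit algorithm: exponential weights over arms with importance-weighted reward estimates. A minimal standalone sketch (the toy Bernoulli bandit is an illustrative assumption, not the paper's BO setup):

```python
import numpy as np

def exp3(reward_fn, n_arms, T, gamma=0.1, seed=0):
    """EXP3: sample an arm from an exponential-weights distribution mixed
    with uniform exploration; update the pulled arm's weight with an
    importance-weighted (unbiased) estimate of its reward in [0, 1]."""
    rng = np.random.default_rng(seed)
    w = np.ones(n_arms)
    pulls = np.zeros(n_arms, dtype=int)
    for _ in range(T):
        p = (1 - gamma) * w / w.sum() + gamma / n_arms
        arm = rng.choice(n_arms, p=p)
        r = reward_fn(arm)
        w[arm] *= np.exp(gamma * (r / p[arm]) / n_arms)
        pulls[arm] += 1
    return pulls

# Toy bandit: arm 2 pays off 90% of the time, the others 10%.
rng = np.random.default_rng(1)
means = np.array([0.1, 0.1, 0.9])
pulls = exp3(lambda a: rng.binomial(1, means[a]), n_arms=3, T=2000)
```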
arXiv Detail & Related papers (2023-06-12T03:35:45Z) - Convergent Bregman Plug-and-Play Image Restoration for Poisson Inverse
Problems [8.673558396669806]
Plug-and-Play (PnP) methods are efficient iterative algorithms for solving ill-posed image inverse problems.
We propose two PnP algorithms based on the Bregman score denoiser for solving Poisson inverse problems.
arXiv Detail & Related papers (2023-06-06T07:36:47Z) - Plug-and-Play split Gibbs sampler: embedding deep generative priors in
Bayesian inference [12.91637880428221]
This paper introduces a plug-and-play sampling algorithm that leverages variable splitting to efficiently sample from a posterior distribution.
It divides the challenging task of posterior sampling into two simpler sampling problems.
Its performance is compared to recent state-of-the-art optimization and sampling methods.
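Variable splitting turns one hard posterior into two easier conditionals sampled alternately. A minimal scalar Gaussian sketch of the two-block structure (all distributions below are illustrative assumptions; in the paper the prior block is handled by a deep generative denoiser rather than an exact Gaussian draw):

```python
import numpy as np

def split_gibbs(y, sigma=1.0, rho=0.5, tau=1.0, iters=5000, seed=0):
    """Two-block Gibbs sampler on the split posterior
    p(x, z | y) ~ exp(-(y-x)^2/2s^2) * exp(-(x-z)^2/2r^2) * exp(-z^2/2t^2).
    Both conditionals are Gaussian here, so the chain is exact."""
    rng = np.random.default_rng(seed)
    x, z = 0.0, 0.0
    xs = []
    for _ in range(iters):
        # x | z, y: precision-weighted combination of likelihood and coupling.
        prec_x = 1 / sigma**2 + 1 / rho**2
        mean_x = (y / sigma**2 + z / rho**2) / prec_x
        x = rng.normal(mean_x, prec_x**-0.5)
        # z | x: prior plus coupling -- the "denoising" block in PnP-SGS.
        prec_z = 1 / tau**2 + 1 / rho**2
        mean_z = (x / rho**2) / prec_z
        z = rng.normal(mean_z, prec_z**-0.5)
        xs.append(x)
    return np.array(xs)

samples = split_gibbs(y=2.0)
```

For this conjugate toy problem the exact posterior of x is N(y/1.8, 5/9), so the chain's marginal statistics can be checked directly.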
arXiv Detail & Related papers (2023-04-21T17:17:51Z) - Optimal Algorithms for the Inhomogeneous Spiked Wigner Model [89.1371983413931]
We derive an approximate message-passing algorithm (AMP) for the inhomogeneous problem.
We identify in particular the existence of a statistical-to-computational gap where known algorithms require a signal-to-noise ratio larger than the information-theoretic threshold to perform better than random.
arXiv Detail & Related papers (2023-02-13T19:57:17Z) - FaDIn: Fast Discretized Inference for Hawkes Processes with General
Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach estimates pattern latency more accurately than the state of the art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z) - On Maximum-a-Posteriori estimation with Plug & Play priors and
stochastic gradient descent [13.168923974530307]
Methods to solve imaging problems usually combine an explicit data likelihood function with a prior that encodes expected properties of the solution.
In a departure from explicit modelling, several recent works have proposed and studied the use of implicit priors defined by an image denoising algorithm.
arXiv Detail & Related papers (2022-01-16T20:50:08Z) - On the Convergence Rate of Projected Gradient Descent for a
Back-Projection based Objective [58.33065918353532]
We consider a back-projection (BP) based fidelity term as an alternative to the common least squares (LS) term.
We show that using the BP term, rather than the LS term, requires fewer iterations of optimization algorithms.
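The contrast between the two fidelity terms can be seen in a few lines: the BP term preconditions the problem with the pseudoinverse, so gradient descent needs far fewer iterations on ill-conditioned operators. A minimal sketch (the square invertible toy operator and step sizes are illustrative assumptions):

```python
import numpy as np

def fidelity_gd(A, y, term="ls", iters=50):
    """Gradient descent on a fidelity term for Ax ~ y.
    term='ls': 0.5*||Ax - y||^2, gradient A^T r, step 1/||A||_2^2;
    term='bp': 0.5*||pinv(A)(Ax - y)||^2, gradient pinv(A) r, step 1 --
    the pseudoinverse flattens the spectrum, so convergence is fast."""
    pinv = np.linalg.pinv(A)
    step = 1.0 if term == "bp" else 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        r = A @ x - y
        x = x - step * (pinv @ r if term == "bp" else A.T @ r)
    return x

# Ill-conditioned forward operator: LS gradient descent crawls, BP does not.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) * np.array([10.0, 1.0, 0.1])  # scaled columns
y = rng.standard_normal(3)
res_ls = np.linalg.norm(A @ fidelity_gd(A, y, "ls") - y)
res_bp = np.linalg.norm(A @ fidelity_gd(A, y, "bp") - y)
```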
arXiv Detail & Related papers (2020-05-03T00:58:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.