Sample Complexity Bounds for 1-bit Compressive Sensing and Binary Stable
Embeddings with Generative Priors
- URL: http://arxiv.org/abs/2002.01697v3
- Date: Mon, 22 Jun 2020 04:57:09 GMT
- Title: Sample Complexity Bounds for 1-bit Compressive Sensing and Binary Stable
Embeddings with Generative Priors
- Authors: Zhaoqiang Liu, Selwyn Gomes, Avtansh Tiwari, Jonathan Scarlett
- Abstract summary: Motivated by advances in compressive sensing with generative models, we study the problem of 1-bit compressive sensing with generative models.
We first consider noiseless 1-bit measurements, and provide sample complexity bounds for approximate recovery under i.i.d. Gaussian measurements.
We demonstrate that the Binary $\epsilon$-Stable Embedding property, which characterizes the robustness of the reconstruction to measurement errors and noise, also holds for 1-bit compressive sensing with Lipschitz continuous generative models.
- Score: 52.06292503723978
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The goal of standard 1-bit compressive sensing is to accurately recover an
unknown sparse vector from binary-valued measurements, each indicating the sign
of a linear function of the vector. Motivated by recent advances in compressive
sensing with generative models, where a generative modeling assumption replaces
the usual sparsity assumption, we study the problem of 1-bit compressive
sensing with generative models. We first consider noiseless 1-bit measurements,
and provide sample complexity bounds for approximate recovery under
i.i.d.~Gaussian measurements and a Lipschitz continuous generative prior, as
well as a near-matching algorithm-independent lower bound. Moreover, we
demonstrate that the Binary $\epsilon$-Stable Embedding property, which
characterizes the robustness of the reconstruction to measurement errors and
noise, also holds for 1-bit compressive sensing with Lipschitz continuous
generative models with sufficiently many Gaussian measurements. In addition, we
apply our results to neural network generative models, and provide a
proof-of-concept numerical experiment demonstrating significant improvements
over sparsity-based approaches.
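As a concrete illustration of the measurement model described in the abstract, the sketch below generates noiseless 1-bit (sign) observations of a signal in the range of a toy generative model and recovers it by a naive correlation-maximizing random search over latent codes. The network G, all dimensions, and the random-search recovery are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "generative model" G: R^k -> R^n (a fixed random ReLU layer),
# standing in for the Lipschitz continuous generative prior of the paper.
k, n, m = 5, 50, 400
W = rng.normal(size=(n, k))
G = lambda z: np.maximum(W @ z, 0.0)

# Ground-truth signal in the range of G, observed via sign measurements.
z_true = rng.normal(size=k)
x_true = G(z_true)
x_true /= np.linalg.norm(x_true)          # signs carry no scale information
A = rng.normal(size=(m, n))               # i.i.d. Gaussian measurement matrix
y = np.sign(A @ x_true)                   # noiseless 1-bit measurements

# Naive recovery: maximize the correlation y^T A x over random latent draws.
best_corr, x_hat = -np.inf, None
for _ in range(20000):
    x = G(rng.normal(size=k))
    norm = np.linalg.norm(x)
    if norm == 0.0:
        continue
    x /= norm
    corr = y @ (A @ x)
    if corr > best_corr:
        best_corr, x_hat = corr, x

print(f"recovery error on the unit sphere: {np.linalg.norm(x_true - x_hat):.3f}")
```

Since sign measurements are invariant to positive scaling, both the true signal and every candidate are normalized to the unit sphere before comparison.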
Related papers
- Outlier Detection Using Generative Models with Theoretical Performance
Guarantees [11.985270449383272]
We establish theoretical recovery guarantees for reconstruction of signals using generative models in the presence of outliers.
Our results are applicable to both linear and nonlinear generator neural networks with an arbitrary number of layers.
arXiv Detail & Related papers (2023-10-16T01:25:34Z)
- Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers [90.45898746733397]
We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling.
We show that one step along the probability flow ODE can be expressed as two steps: 1) a restoration step that runs ascent on the conditional log-likelihood at some infinitesimally previous time, and 2) a degradation step that runs the forward process using noise pointing back towards the current gradient.
arXiv Detail & Related papers (2023-03-06T18:59:19Z)
- Convergence of uncertainty estimates in Ensemble and Bayesian sparse model discovery [4.446017969073817]
We show empirical success, in terms of accuracy and robustness to noise, of a bootstrapping-based sequential-thresholding least-squares estimator.
We show that this bootstrapping-based ensembling technique yields a provably correct variable selection procedure whose error rate converges exponentially fast.
arXiv Detail & Related papers (2023-01-30T04:07:59Z)
- Off-the-grid prediction and testing for linear combination of translated features [2.774897240515734]
We consider a model where a signal (discrete or continuous) is observed with an additive Gaussian noise process.
We extend previous prediction results for off-the-grid estimators by taking into account that the scale parameter may vary.
We propose a procedure to test whether the features of the observed signal belong to a given finite collection.
arXiv Detail & Related papers (2022-12-02T13:48:45Z)
- Robust 1-bit Compressive Sensing with Partial Gaussian Circulant Matrices and Generative Priors [54.936314353063494]
We provide recovery guarantees for a correlation-based optimization algorithm for robust 1-bit compressive sensing.
We make use of a practical iterative algorithm, and perform numerical experiments on image datasets to corroborate our results.
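The partial Gaussian circulant measurement operator mentioned in this summary can be sketched as follows: rows of the circulant matrix generated by a Gaussian vector, applied efficiently via FFT-based circular convolution. The dimensions and names are illustrative; the recovery algorithm itself is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 64, 32                              # signal length, number of rows kept

# Partial Gaussian circulant matrix: the circulant matrix generated by a
# Gaussian vector g, restricted to a random subset of m row indices.
g = rng.normal(size=n)
rows = rng.choice(n, size=m, replace=False)

def partial_circulant_apply(x):
    # Circulant matrix-vector product = circular convolution, done via FFT,
    # after which only the selected rows are kept.
    full = np.real(np.fft.ifft(np.fft.fft(g) * np.fft.fft(x)))
    return full[rows]

x = rng.normal(size=n)
y = np.sign(partial_circulant_apply(x))    # 1-bit measurements of x
```

The FFT implementation is the practical point: applying the operator costs O(n log n) rather than the O(mn) of an explicit dense Gaussian matrix.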
arXiv Detail & Related papers (2021-08-08T05:28:06Z)
- Learning Mixtures of Low-Rank Models [89.39877968115833]
We study the problem of learning mixtures of low-rank models.
We develop an algorithm that is guaranteed to recover the unknown matrices with near-optimal sample complexity.
In addition, the proposed algorithm is provably stable against random noise.
arXiv Detail & Related papers (2020-09-23T17:53:48Z)
- One-Bit Compressed Sensing via One-Shot Hard Thresholding [7.594050968868919]
The problem of 1-bit compressed sensing is to estimate a sparse signal from a small number of binary measurements.
We present a novel and concise analysis that moves away from the widely used non-constrained notion of width.
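A minimal sketch of the classic one-shot hard-thresholding estimator this title refers to: back-project the sign measurements, keep the s largest entries, and renormalize, all in a single non-iterative step. The dimensions below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n, s, m = 200, 5, 1500

# s-sparse unit-norm signal and i.i.d. Gaussian 1-bit measurements.
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.normal(size=s)
x /= np.linalg.norm(x)
A = rng.normal(size=(m, n))
y = np.sign(A @ x)

# One-shot estimate: back-projection proxy, hard threshold to the s largest
# entries in magnitude, then renormalize to the unit sphere.
proxy = A.T @ y / m
keep = np.argsort(np.abs(proxy))[-s:]
x_hat = np.zeros(n)
x_hat[keep] = proxy[keep]
x_hat /= np.linalg.norm(x_hat)

print(f"recovery error: {np.linalg.norm(x - x_hat):.3f}")
```

The proxy A^T y / m concentrates around a positive multiple of x under Gaussian measurements, which is why a single thresholding step already gives a consistent direction estimate.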
arXiv Detail & Related papers (2020-07-07T17:28:03Z)
- The Generalized Lasso with Nonlinear Observations and Generative Priors [63.541900026673055]
We make the assumption of sub-Gaussian measurements, which is satisfied by a wide range of measurement models.
We show that our result can be extended to the uniform recovery guarantee under the assumption of a so-called local embedding property.
arXiv Detail & Related papers (2020-06-22T16:43:35Z)
- Robust Compressed Sensing using Generative Models [98.64228459705859]
In this paper we propose an algorithm inspired by the Median-of-Means (MOM) estimator.
Our algorithm guarantees recovery for heavy-tailed data, even in the presence of outliers.
arXiv Detail & Related papers (2020-06-16T19:07:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the generated list (including all information) and is not responsible for any consequences of its use.