Quantized Compressed Sensing with Score-Based Generative Models
- URL: http://arxiv.org/abs/2211.13006v1
- Date: Wed, 2 Nov 2022 15:19:07 GMT
- Title: Quantized Compressed Sensing with Score-Based Generative Models
- Authors: Xiangming Meng and Yoshiyuki Kabashima
- Abstract summary: We propose an unsupervised data-driven approach called quantized compressed sensing with SGM (QCS-SGM).
The proposed QCS-SGM significantly outperforms existing state-of-the-art algorithms by a large margin for both in-distribution and out-of-distribution samples.
As a posterior sampling method, QCS-SGM can be easily used to obtain confidence intervals or uncertainty estimates of the reconstructed results.
- Score: 6.066320781596792
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider the general problem of recovering a high-dimensional signal from
noisy quantized measurements. Quantization, especially coarse quantization such
as one-bit sign measurements, leads to severe information loss, and thus good
prior knowledge of the unknown signal is helpful for accurate recovery.
Motivated by the power of score-based generative models (SGM, also known as
diffusion models) in capturing the rich structure of natural signals beyond
simple sparsity, we propose an unsupervised data-driven approach called
quantized compressed sensing with SGM (QCS-SGM), where the prior distribution
is modeled by a pre-trained SGM. To perform posterior sampling, an annealed
pseudo-likelihood score, termed the noise-perturbed pseudo-likelihood score, is
introduced and combined with the prior score of the SGM. The proposed QCS-SGM
applies to an arbitrary number of quantization bits. Experiments on a variety of
baseline datasets demonstrate that the proposed QCS-SGM significantly
outperforms existing state-of-the-art algorithms by a large margin for both
in-distribution and out-of-distribution samples. Moreover, as a posterior
sampling method, QCS-SGM can be easily used to obtain confidence intervals or
uncertainty estimates of the reconstructed results. The code for the
experiments will be open-sourced at https://github.com/mengxiangming/QCS-SGM
upon future publication.
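To make the ingredients of the abstract concrete, the sketch below recovers a signal from one-bit sign measurements by combining a prior score with the exact likelihood score of the sign observations. This is a simplified toy, not the paper's algorithm: a standard Gaussian prior stands in for the pre-trained SGM, and plain gradient ascent on the log-posterior stands in for annealed Langevin posterior sampling.

```python
import numpy as np
from math import erf

def norm_cdf(z):
    return 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))

def norm_pdf(z):
    return np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)

rng = np.random.default_rng(0)
n, m, sigma = 20, 200, 0.5
x_true = rng.standard_normal(n)
A = rng.standard_normal((m, n)) / np.sqrt(n)
# one-bit quantized measurements: only the sign survives
y = np.sign(A @ x_true + sigma * rng.standard_normal(m))

def likelihood_score(x):
    # gradient of sum_i log Phi(y_i * a_i.x / sigma) w.r.t. x
    z = y * (A @ x) / sigma
    ratio = norm_pdf(z) / np.clip(norm_cdf(z), 1e-12, None)
    return A.T @ (y * ratio) / sigma

def prior_score(x):
    # score of a standard Gaussian prior (stand-in for a pre-trained SGM)
    return -x

x = np.zeros(n)
for _ in range(300):
    # combine prior score and likelihood score, as in posterior sampling
    x = x + 0.02 * (prior_score(x) + likelihood_score(x))

# one-bit measurements lose the magnitude, so compare directions
cosine = x @ x_true / (np.linalg.norm(x) * np.linalg.norm(x_true))
```

With 200 sign measurements of a 20-dimensional signal, the recovered direction aligns closely with the ground truth, even though each measurement carries at most one bit.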
Related papers
- A Good Score Does not Lead to A Good Generative Model [14.752242187781107]
Score-based Generative Models (SGMs) are a leading method in generative modeling.
We show that SGMs can generate samples from a distribution that is close to the ground-truth if the underlying score function is learned well.
arXiv Detail & Related papers (2024-01-10T00:17:36Z)
- Simulation-Based Inference with Quantile Regression [0.0]
We present Neural Quantile Estimation (NQE), a novel Simulation-Based Inference (SBI) method based on conditional quantile regression.
NQE autoregressively learns individual one-dimensional quantiles for each posterior dimension, conditioned on the data and previous posterior dimensions.
We demonstrate NQE achieves state-of-the-art performance on a variety of benchmark problems.
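The autoregressive quantile scheme described above can be sketched with inverse-transform sampling. This toy uses a known analytic posterior (theta1 ~ N(0,1), theta2 | theta1 ~ N(theta1, 1)) whose exact quantile functions stand in for the learned conditional quantile networks; all names here are illustrative, not from the paper.

```python
import numpy as np
from statistics import NormalDist

nd = NormalDist()  # standard normal; inv_cdf is its quantile function

def sample_posterior(rng):
    # draw each dimension by pushing a uniform through its (conditional) quantile
    u1, u2 = rng.random(), rng.random()
    t1 = nd.inv_cdf(u1)        # quantile function of the first dimension
    t2 = t1 + nd.inv_cdf(u2)   # conditional quantile of the second dimension
    return t1, t2

rng = np.random.default_rng(0)
samples = np.array([sample_posterior(rng) for _ in range(20000)])
rho = np.corrcoef(samples[:, 0], samples[:, 1])[0, 1]
```

The empirical marginals and the correlation (which should be near 1/sqrt(2) for this toy) confirm that chaining one-dimensional quantile draws reproduces the joint posterior.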
arXiv Detail & Related papers (2024-01-04T18:53:50Z)
- Preconditioned Score-based Generative Models [49.88840603798831]
An intuitive acceleration method is to reduce the number of sampling iterations, which, however, causes severe performance degradation.
We propose a model-agnostic preconditioned diffusion sampling (PDS) method that leverages matrix preconditioning to alleviate this problem.
PDS alters the sampling process of a vanilla SGM at a marginal extra computation cost and without model retraining.
arXiv Detail & Related papers (2023-02-13T16:30:53Z)
- QCS-SGM+: Improved Quantized Compressed Sensing With Score-Based Generative Models [17.49551570305112]
In practical compressed sensing (CS), the obtained measurements typically necessitate quantization to a limited number of bits prior to transmission or storage.
We introduce an advanced variant of QCS-SGM, termed QCS-SGM+, capable of handling general matrices effectively.
arXiv Detail & Related papers (2023-02-02T07:36:58Z)
- Qimera: Data-free Quantization with Synthetic Boundary Supporting Samples [8.975667614727652]
We propose Qimera, a method that uses superposed latent embeddings to generate synthetic boundary supporting samples.
The experimental results show that Qimera achieves state-of-the-art performance in various settings of data-free quantization.
arXiv Detail & Related papers (2021-11-04T04:52:50Z)
- Robust Compressed Sensing MRI with Deep Generative Priors [84.69062247243953]
We present the first successful application of the CSGM framework on clinical MRI data.
We train a generative prior on brain scans from the fastMRI dataset, and show that posterior sampling via Langevin dynamics achieves high quality reconstructions.
arXiv Detail & Related papers (2021-08-03T08:52:06Z)
- Score-based Generative Modeling in Latent Space [93.8985523558869]
Score-based generative models (SGMs) have recently demonstrated impressive results in terms of both sample quality and distribution coverage.
Here, we propose the Latent Score-based Generative Model (LSGM), a novel approach that trains SGMs in a latent space.
Moving from data to latent space allows us to train more expressive generative models, apply SGMs to non-continuous data, and learn smoother SGMs in a smaller space.
arXiv Detail & Related papers (2021-06-10T17:26:35Z)
- Continual Learning with Fully Probabilistic Models [70.3497683558609]
We present an approach for continual learning based on fully probabilistic (or generative) models of machine learning.
We propose a pseudo-rehearsal approach using a Gaussian Mixture Model (GMM) instance for both generator and classifier functionalities.
We show that the resulting Gaussian Mixture Replay (GMR) approach achieves state-of-the-art performance on common class-incremental learning problems at very competitive time and memory complexity.
arXiv Detail & Related papers (2021-04-19T12:26:26Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
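Decoupled sample paths of this kind can be sketched with Matheron's rule: a posterior sample is a joint prior draw plus a data-driven correction, f_post(Xs) = f_prior(Xs) + K(Xs,X)(K(X,X) + s^2 I)^{-1}(y - f_prior(X) - eps). The toy data and names below are illustrative, and this dense-kernel version omits the paper's efficiency machinery.

```python
import numpy as np

rng = np.random.default_rng(1)

def kern(a, b, ell=1.0):
    # squared-exponential kernel on 1-D inputs
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

X = np.linspace(-2.0, 2.0, 10)       # training inputs
y = np.sin(X)                        # training targets
s2 = 0.01                            # observation-noise variance
Xs = np.linspace(-2.4, 2.4, 17)      # test inputs

# joint prior draws over train and test locations (small jitter for stability)
Z = np.concatenate([X, Xs])
L = np.linalg.cholesky(kern(Z, Z) + 1e-6 * np.eye(Z.size))

Kinv = np.linalg.inv(kern(X, X) + s2 * np.eye(X.size))
Ks = kern(Xs, X)

n_samp = 2000
f = L @ rng.standard_normal((Z.size, n_samp))           # prior function draws
eps = np.sqrt(s2) * rng.standard_normal((X.size, n_samp))
# Matheron's rule: prior draw + pathwise update toward the data
f_post = f[X.size:] + Ks @ Kinv @ (y[:, None] - f[:X.size] - eps)

post_mean = Ks @ Kinv @ y            # exact posterior mean for comparison
err = np.abs(f_post.mean(axis=1) - post_mean).max()
```

Averaging many pathwise samples recovers the exact posterior mean, which is a quick sanity check that the decoupled construction targets the right distribution.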
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
- Targeted stochastic gradient Markov chain Monte Carlo for hidden Markov models with rare latent states [48.705095800341944]
Markov chain Monte Carlo (MCMC) algorithms for hidden Markov models often rely on the forward-backward sampler.
This makes them computationally slow as the length of the time series increases, motivating the development of sub-sampling-based approaches.
We propose a targeted sub-sampling approach that over-samples observations corresponding to rare latent states when calculating the gradient of parameters associated with them.
arXiv Detail & Related papers (2018-10-31T17:44:20Z)
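The targeted sub-sampling idea above can be illustrated with a Horvitz-Thompson-style gradient estimator: terms tied to rare latent states are always included, while common terms are included with probability p and reweighted by 1/p, keeping the estimator unbiased. The synthetic per-observation gradients and all names here are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
g_common = rng.standard_normal(100)   # per-observation gradient terms (common states)
g_rare = np.full(5, 10.0)             # large gradient terms tied to rare latent states
full_grad = g_common.sum() + g_rare.sum()

p = 0.5                               # inclusion probability for common terms
n_draws = 20000
masks = rng.random((n_draws, 100)) < p
# rare terms always included; sampled common terms reweighted by 1/p
estimates = (masks * g_common / p).sum(axis=1) + g_rare.sum()
```

Averaged over draws, the sub-sampled estimates match the full-data gradient, while each draw touches only about half of the common observations.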
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.