Accurate, provable, and fast nonlinear tomographic reconstruction: A variational inequality approach
- URL: http://arxiv.org/abs/2503.19925v1
- Date: Thu, 13 Mar 2025 19:04:34 GMT
- Title: Accurate, provable, and fast nonlinear tomographic reconstruction: A variational inequality approach
- Authors: Mengqi Lou, Kabir Aladin Verchand, Sara Fridovich-Keil, Ashwin Pananjady
- Abstract summary: We develop a simple iterative algorithm for single-material reconstruction, which we call EXACT (EXtragradient Algorithm for Computed Tomography). We prove guarantees on the statistical and computational performance of EXACT under practical assumptions on the measurement process. We apply our EXACT algorithm to a CT phantom image recovery task and show that it often requires fewer X-ray projection exposures, lower source intensity, and less time to achieve similar reconstruction quality to existing methods.
- Score: 9.378079414376842
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider the problem of signal reconstruction for computed tomography (CT) under a nonlinear forward model that accounts for exponential signal attenuation, a polychromatic X-ray source, general measurement noise (e.g. Poisson shot noise), and observations acquired over multiple wavelength windows. We develop a simple iterative algorithm for single-material reconstruction, which we call EXACT (EXtragradient Algorithm for Computed Tomography), based on formulating our estimate as the fixed point of a monotone variational inequality. We prove guarantees on the statistical and computational performance of EXACT under practical assumptions on the measurement process. We also consider a recently introduced variant of this model with Gaussian measurements, and present sample and iteration complexity bounds for EXACT that improve upon those of existing algorithms. We apply our EXACT algorithm to a CT phantom image recovery task and show that it often requires fewer X-ray projection exposures, lower source intensity, and less computation time to achieve similar reconstruction quality to existing methods.
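To make the fixed-point/variational-inequality formulation concrete, the following is a minimal Python sketch of a projected extragradient iteration applied to a simplified, monoenergetic Beer-Lambert forward model with Poisson noise. The function name `extragradient_ct_sketch`, the choice of operator F (the Poisson negative log-likelihood gradient), the step-size heuristic, and the toy data are illustrative assumptions; they do not reproduce the paper's polychromatic model or the exact EXACT update and guarantees.

```python
import numpy as np


def extragradient_ct_sketch(A, y, I0, x0, step=None, n_iters=1000):
    """Hedged sketch of a projected extragradient iteration for CT.

    Assumptions (not from the paper): a monoenergetic Beer-Lambert model
    y ~ Poisson(I0 * exp(-A @ x)) and a VI operator F taken to be the
    gradient of the Poisson negative log-likelihood. A is the system
    (projection) matrix, y the measured counts, I0 the source intensity.
    """
    if step is None:
        # Crude step-size heuristic: 1 / (I0 * ||A||_2^2) upper-bounds the
        # local Lipschitz constant of F, since the mean counts never exceed I0.
        step = 1.0 / (I0 * np.linalg.norm(A, 2) ** 2)

    def F(x):
        # For mu(x) = I0 * exp(-A x), the gradient of
        # sum_i [mu_i - y_i * log(mu_i)] with respect to x is A^T (y - mu).
        mu = I0 * np.exp(-A @ x)
        return A.T @ (y - mu)

    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iters):
        # Extrapolation (look-ahead) step, projected onto nonnegative attenuation.
        x_half = np.clip(x - step * F(x), 0.0, None)
        # Correction step evaluated at the look-ahead point.
        x = np.clip(x - step * F(x_half), 0.0, None)
    return x


if __name__ == "__main__":
    # Toy usage on a random stand-in system matrix (purely illustrative).
    rng = np.random.default_rng(0)
    A = rng.uniform(0.0, 1.0, size=(128, 64))   # stand-in projection matrix
    x_true = rng.uniform(0.0, 0.05, size=64)    # stand-in attenuation values
    I0 = 1e4
    y = rng.poisson(I0 * np.exp(-A @ x_true)).astype(float)
    x_hat = extragradient_ct_sketch(A, y, I0, x0=np.zeros(64))
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The extrapolation-then-correction structure above is the standard Korpelevich extragradient step for a variational inequality; the paper's contribution lies in the specific operator built from the nonlinear (and possibly polychromatic) forward model and in the accompanying statistical and computational guarantees, which this sketch does not attempt to capture.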
Related papers
- QN-Mixer: A Quasi-Newton MLP-Mixer Model for Sparse-View CT Reconstruction [0.0]
We introduce QN-Mixer, an algorithm based on the quasi-Newton approach.
Incept-Mixer is an efficient neural architecture that serves as a non-local regularization term.
Our approach intelligently downsamples information, significantly reducing computational requirements.
arXiv Detail & Related papers (2024-02-28T00:20:25Z) - Low-resolution Prior Equilibrium Network for CT Reconstruction [3.5639148953570836]
We present a novel deep learning-based CT reconstruction model, where a low-resolution image is introduced to obtain an effective regularization term for improving the network's robustness.
Experimental results on both sparse-view and limited-angle reconstruction problems are provided, demonstrating that our end-to-end low-resolution prior equilibrium model outperforms other state-of-the-art methods in terms of noise reduction, contrast-to-noise ratio, and preservation of edge details.
arXiv Detail & Related papers (2024-01-28T13:59:58Z) - Solving Linear Inverse Problems Provably via Posterior Sampling with Latent Diffusion Models [98.95988351420334]
We present the first framework to solve linear inverse problems leveraging pre-trained latent diffusion models.
We theoretically analyze our algorithm showing provable sample recovery in a linear model setting.
arXiv Detail & Related papers (2023-07-02T17:21:30Z) - An Optimization-based Deep Equilibrium Model for Hyperspectral Image Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
arXiv Detail & Related papers (2023-06-10T08:25:16Z) - Optimal Algorithms for the Inhomogeneous Spiked Wigner Model [89.1371983413931]
We derive an approximate message-passing algorithm (AMP) for the inhomogeneous problem.
We identify, in particular, a statistical-to-computational gap where known algorithms require a signal-to-noise ratio larger than the information-theoretic threshold to perform better than random.
arXiv Detail & Related papers (2023-02-13T19:57:17Z) - Amortized Bayesian Inference of GISAXS Data with Normalizing Flows [0.10752246796855561]
We propose a simulation-based framework that combines variational auto-encoders and normalizing flows to estimate the posterior distribution of object parameters.
We demonstrate that our method reduces the inference cost by orders of magnitude while producing consistent results with ABC.
arXiv Detail & Related papers (2022-10-04T12:09:57Z) - Denoising Diffusion Restoration Models [110.1244240726802]
Denoising Diffusion Restoration Models (DDRM) is an efficient, unsupervised posterior sampling method.
We demonstrate DDRM's versatility on several image datasets for super-resolution, deblurring, inpainting, and colorization.
arXiv Detail & Related papers (2022-01-27T20:19:07Z) - Nonparametric posterior learning for emission tomography with multimodal data [1.6500749121196991]
We adapt the recently proposed nonparametric posterior learning technique to the context of Poisson-type data in emission tomography.
We derive sampling algorithms which are trivially parallelizable, scalable and very easy to implement.
We show theoretically and numerically that such data augmentation significantly increases mixing times for the Markov chain.
arXiv Detail & Related papers (2021-07-29T12:43:02Z) - Regularization by Denoising Sub-sampled Newton Method for Spectral CT Multi-Material Decomposition [78.37855832568569]
We propose to solve a model-based maximum a posteriori problem to reconstruct multi-material images with application to spectral CT.
In particular, we propose to solve a regularized optimization problem based on a plug-in image-denoising function.
We show numerical and experimental results for spectral CT materials decomposition.
arXiv Detail & Related papers (2021-03-25T15:20:10Z) - Deep Variational Network Toward Blind Image Restoration [60.45350399661175]
Blind image restoration is a common yet challenging problem in computer vision.
We propose a novel blind image restoration method that aims to integrate the advantages of both model-driven and data-driven approaches.
Experiments on two typical blind image restoration tasks, namely image denoising and super-resolution, demonstrate that the proposed method achieves superior performance over current state-of-the-art methods.
arXiv Detail & Related papers (2020-08-25T03:30:53Z) - Learned convex regularizers for inverse problems [3.294199808987679]
We propose to learn a data-adaptive input-convex neural network (ICNN) as a regularizer for inverse problems.
We prove the existence of a sub-gradient-based algorithm that leads to a monotonically decreasing error in the parameter space with iterations.
We show that the proposed convex regularizer is at least competitive with and sometimes superior to state-of-the-art data-driven techniques for inverse problems.
arXiv Detail & Related papers (2020-08-06T18:58:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.