Learning from small data sets: Patch-based regularizers in inverse
problems for image reconstruction
- URL: http://arxiv.org/abs/2312.16611v1
- Date: Wed, 27 Dec 2023 15:30:05 GMT
- Title: Learning from small data sets: Patch-based regularizers in inverse
problems for image reconstruction
- Authors: Moritz Piening, Fabian Altekrüger, Johannes Hertrich, Paul Hagemann,
Andrea Walther, Gabriele Steidl
- Abstract summary: Recent advances in machine learning require a huge amount of data and computing capacity to train the networks.
Our paper addresses the issue of learning from small data sets by taking patches of very few images into account.
We show how we can achieve uncertainty quantification by approximating the posterior using Langevin Monte Carlo methods.
- Score: 1.1650821883155187
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The solution of inverse problems is of fundamental interest in medical and
astronomical imaging, geophysics as well as engineering and life sciences.
Recent advances were made by using methods from machine learning, in particular
deep neural networks. Most of these methods require a huge amount of (paired)
data and computing capacity to train the networks, which often may not be
available. Our paper addresses the issue of learning from small data sets by
taking patches of very few images into account. We focus on the combination of
model-based and data-driven methods by approximating just the image prior, also
known as the regularizer in the variational model. We review two methodically
different approaches, namely maximizing the log-likelihood of the patch
distribution, and penalizing Wasserstein-like discrepancies between whole empirical
patch distributions. From the point of view of Bayesian inverse problems, we
show how we can achieve uncertainty quantification by approximating the
posterior using Langevin Monte Carlo methods. We demonstrate the power of the
methods in computed tomography, image super-resolution, and inpainting. Indeed,
the approach provides also high-quality results in zero-shot super-resolution,
where only a low-resolution image is available. The paper is accompanied by a
GitHub repository containing implementations of all methods as well as data
examples so that the reader can get their own insight into the performance.
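The two ingredients the abstract describes, a regularizer built from the log-likelihood of image patches and Langevin Monte Carlo sampling of the resulting posterior, can be sketched in a few lines. The sketch below is illustrative only and is not the paper's implementation (see the accompanying GitHub repository for that): a Gaussian patch prior stands in for the learned patch model (e.g. a normalizing flow), the inverse problem is simple denoising, and all parameter values are made up.

```python
import numpy as np

def extract_patches(img, p=4, stride=2):
    """Collect all p x p patches of a 2D image as rows of an (n, p*p) matrix."""
    H, W = img.shape
    rows = [img[i:i + p, j:j + p].ravel()
            for i in range(0, H - p + 1, stride)
            for j in range(0, W - p + 1, stride)]
    return np.stack(rows)

def patch_nll(img, mean, prec, p=4, stride=2):
    """Average negative log-likelihood of the image's patches under a Gaussian
    patch prior N(mean, prec^-1), up to an additive constant. The Gaussian
    stands in for a learned patch model such as a normalizing flow."""
    D = extract_patches(img, p, stride) - mean
    return 0.5 * np.einsum('np,pq,nq->', D, prec, D) / len(D)

def grad_patch_nll(img, mean, prec, p=4, stride=2):
    """Gradient of patch_nll w.r.t. the image: scatter-add each patch's
    gradient prec @ (patch - mean) back to its spatial location."""
    H, W = img.shape
    g = np.zeros_like(img)
    n = 0
    for i in range(0, H - p + 1, stride):
        for j in range(0, W - p + 1, stride):
            d = img[i:i + p, j:j + p].ravel() - mean
            g[i:i + p, j:j + p] += (prec @ d).reshape(p, p)
            n += 1
    return g / n

def langevin_denoise(y, lam=20.0, tau=0.05, n_iter=500, burn=200, seed=0):
    """Unadjusted Langevin sampling for the toy posterior with energy
    F(x) = 0.5 ||x - y||^2 + lam * patch_nll(x); returns the sample mean
    as a posterior-mean estimate (its spread would give uncertainty)."""
    rng = np.random.default_rng(seed)
    d = 16  # patch dimension p*p for the default p = 4
    mean, prec = np.zeros(d), np.eye(d)
    x, acc, n_acc = y.copy(), np.zeros_like(y), 0
    for k in range(n_iter):
        grad = (x - y) + lam * grad_patch_nll(x, mean, prec)
        x = x - tau * grad + np.sqrt(2 * tau) * rng.standard_normal(x.shape)
        if k >= burn:
            acc += x
            n_acc += 1
    return acc / n_acc
```

Replacing the quadratic Gaussian energy with the negative log-density of a patch flow (and its autograd gradient) recovers the flow-based variant; replacing `patch_nll` with a Wasserstein-like distance between the empirical patch distributions of `x` and of reference images gives the second approach reviewed in the paper.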
Related papers
- Learning Image Priors through Patch-based Diffusion Models for Solving Inverse Problems [15.298502168256519]
Diffusion models can learn strong image priors from the underlying data distribution and use them to solve inverse problems, but the training process is computationally expensive and requires lots of data.
This paper proposes a method to learn an efficient data prior for the entire image by training diffusion models only on patches of images.
arXiv Detail & Related papers (2024-06-04T16:30:37Z)
- One-Shot Image Restoration [0.0]
Experimental results demonstrate the applicability, robustness and computational efficiency of the proposed approach for supervised image deblurring and super-resolution.
Our results showcase significant improvements in the sample efficiency, generalization, and time complexity of the learned models.
arXiv Detail & Related papers (2024-04-26T14:03:23Z)
- UMat: Uncertainty-Aware Single Image High Resolution Material Capture [2.416160525187799]
We propose a learning-based method to recover normals, specularity, and roughness from a single diffuse image of a material.
Our method is the first one to deal with the problem of modeling uncertainty in material digitization.
arXiv Detail & Related papers (2023-05-25T17:59:04Z)
- PatchNR: Learning from Small Data by Patch Normalizing Flow Regularization [57.37911115888587]
We introduce a regularizer for the variational modeling of inverse problems in imaging based on normalizing flows.
Our regularizer, called patchNR, involves a normalizing flow learned on patches of very few images.
arXiv Detail & Related papers (2022-05-24T12:14:26Z)
- Learning Graph Regularisation for Guided Super-Resolution [77.7568596501908]
We introduce a novel formulation for guided super-resolution.
Its core is a differentiable optimisation layer that operates on a learned affinity graph.
We extensively evaluate our method on several datasets, and consistently outperform recent baselines in terms of quantitative reconstruction errors.
arXiv Detail & Related papers (2022-03-27T13:12:18Z)
- Mining the manifolds of deep generative models for multiple data-consistent solutions of ill-posed tomographic imaging problems [10.115302976900445]
Tomographic imaging is in general an ill-posed inverse problem.
We propose a new empirical sampling method that computes multiple solutions of a tomographic inverse problem.
arXiv Detail & Related papers (2022-02-10T20:27:31Z)
- Learning Discriminative Shrinkage Deep Networks for Image Deconvolution [122.79108159874426]
We propose an effective non-blind deconvolution approach by learning discriminative shrinkage functions to implicitly model these terms.
Experimental results show that the proposed method performs favorably against the state-of-the-art ones in terms of efficiency and accuracy.
arXiv Detail & Related papers (2021-11-27T12:12:57Z)
- Shared Prior Learning of Energy-Based Models for Image Reconstruction [69.72364451042922]
We propose a novel learning-based framework for image reconstruction particularly designed for training without ground truth data.
In the absence of ground truth data, we change the loss functional to a patch-based Wasserstein functional.
In shared prior learning, both aforementioned optimal control problems are optimized simultaneously with shared learned parameters of the regularizer.
arXiv Detail & Related papers (2020-11-12T17:56:05Z)
- Deep Variational Network Toward Blind Image Restoration [60.45350399661175]
Blind image restoration is a common yet challenging problem in computer vision.
We propose a novel blind image restoration method that aims to integrate the advantages of both.
Experiments on two typical blind IR tasks, namely image denoising and super-resolution, demonstrate that the proposed method achieves superior performance over current state-of-the-art methods.
arXiv Detail & Related papers (2020-08-25T03:30:53Z)
- Data Augmentation for Histopathological Images Based on Gaussian-Laplacian Pyramid Blending [59.91656519028334]
Data imbalance is a major problem that affects several machine learning (ML) algorithms.
In this paper, we propose a novel approach capable of not only augmenting the histopathological image (HI) dataset but also distributing the inter-patient variability.
Experimental results on the BreakHis dataset have shown promising gains vis-a-vis the majority of DA techniques presented in the literature.
arXiv Detail & Related papers (2020-01-31T22:02:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.