Computed Tomography Reconstruction Using Deep Image Prior and Learned
Reconstruction Methods
- URL: http://arxiv.org/abs/2003.04989v2
- Date: Thu, 12 Mar 2020 12:09:52 GMT
- Title: Computed Tomography Reconstruction Using Deep Image Prior and Learned
Reconstruction Methods
- Authors: Daniel Otero Baguer, Johannes Leuschner, Maximilian Schmidt
- Abstract summary: In this work, we investigate the application of deep learning methods to computed tomography in the low-data regime.
We find that the learned primal-dual method achieves outstanding performance in terms of reconstruction quality and data efficiency.
The proposed methods improve on the state-of-the-art results in the low-data regime.
- Score: 0.8263596314702016
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, we investigate the application of deep learning methods to
computed tomography in the context of a low-data regime. As motivation, we
review some of the existing approaches and obtain quantitative results after
training them with different amounts of data. We find that the learned
primal-dual method achieves outstanding performance in terms of reconstruction
quality and data efficiency. However, end-to-end learned methods generally have
two issues: a) a lack of classical guarantees in inverse problems and b) a lack
of generalization when not trained with enough data. To overcome these issues,
we bring in the deep image prior approach in combination with classical
regularization. The proposed methods improve on the state-of-the-art results in
the low-data regime.
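For intuition, here is a minimal sketch of the deep-image-prior idea combined with a classical total-variation regularizer, as described in the abstract. It assumes a differentiable PyTorch-style CT forward operator `A`, a measured sinogram `y`, an untrained CNN `net`, and a fixed random input `z`; all names and hyperparameters are illustrative assumptions, not the paper's actual code.

```python
import torch

def dip_tv_reconstruct(A, y, net, z, alpha=1e-4, steps=5000, lr=1e-3):
    # Fit the untrained network so its output matches the measured
    # sinogram; the network architecture itself acts as the image prior.
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        x = net(z)                                   # current image estimate
        data_fit = ((A(x) - y) ** 2).mean()          # CT data-consistency term
        tv = (x[..., 1:, :] - x[..., :-1, :]).abs().mean() \
           + (x[..., :, 1:] - x[..., :, :-1]).abs().mean()  # anisotropic TV
        loss = data_fit + alpha * tv                 # DIP + classical regularizer
        loss.backward()
        opt.step()
    return net(z).detach()
```

Because only the network weights are optimized against a single measurement, no training data are needed, which is what makes this combination attractive in the low-data regime.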
Related papers
- Enhancing Consistency and Mitigating Bias: A Data Replay Approach for
Incremental Learning [100.7407460674153]
Deep learning systems are prone to catastrophic forgetting when learning from a sequence of tasks.
To mitigate the problem, a line of methods proposes replaying data from previously learned tasks while learning new ones.
However, this is often impractical due to memory constraints or data privacy concerns.
As an alternative, data-free replay methods synthesize samples by inverting the classification model.
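As a rough illustration of the inversion idea (not this paper's specific method), one can synthesize a surrogate sample for a past class by optimizing an input that a frozen classifier assigns to that class; the function and hyperparameters below are hypothetical.

```python
import torch
import torch.nn.functional as F

def invert_sample(model, target, shape=(1, 3, 32, 32), steps=200, lr=0.1):
    # Optimize random noise until the frozen old-task classifier labels it
    # as `target`; the result can be replayed in place of stored real data.
    x = torch.randn(shape, requires_grad=True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(model(x), torch.tensor([target]))
        loss.backward()
        opt.step()
    return x.detach()
```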
arXiv Detail & Related papers (2024-01-12T12:51:12Z)
- Learning from small data sets: Patch-based regularizers in inverse problems for image reconstruction [1.1650821883155187]
Recent advances in machine learning require huge amounts of data and computing capacity to train networks.
Our paper addresses the issue of learning from small data sets by using patches extracted from only a few images.
We show how we can achieve uncertainty quantification by approximating the posterior using Langevin Monte Carlo methods.
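A minimal sketch of posterior sampling with an unadjusted Langevin scheme, assuming a hypothetical `energy` function (a negative log-posterior combining a data term and a learned patch-based prior); this is illustrative, not the authors' implementation.

```python
import torch

def langevin_samples(energy, x0, step=1e-4, n_steps=1000, burn_in=500):
    # Unadjusted Langevin dynamics: gradient descent on the energy plus
    # Gaussian noise yields approximate samples from exp(-energy(x)).
    x = x0.clone().requires_grad_(True)
    samples = []
    for t in range(n_steps):
        grad = torch.autograd.grad(energy(x), x)[0]
        with torch.no_grad():
            x += -step * grad + (2 * step) ** 0.5 * torch.randn_like(x)
        if t >= burn_in:
            samples.append(x.detach().clone())
    return torch.stack(samples)  # pixelwise mean/std give uncertainty maps
```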
arXiv Detail & Related papers (2023-12-27T15:30:05Z)
- Learning Discriminative Shrinkage Deep Networks for Image Deconvolution [122.79108159874426]
We propose an effective non-blind deconvolution approach by learning discriminative shrinkage functions to implicitly model these terms.
Experimental results show that the proposed method performs favorably against the state-of-the-art ones in terms of efficiency and accuracy.
arXiv Detail & Related papers (2021-11-27T12:12:57Z)
- Always Be Dreaming: A New Approach for Data-Free Class-Incremental Learning [73.24988226158497]
We consider the high-impact problem of Data-Free Class-Incremental Learning (DFCIL).
We propose a novel incremental distillation strategy for DFCIL, contributing a modified cross-entropy training and importance-weighted feature distillation.
Our method results in up to a 25.1% increase in final task accuracy (absolute difference) compared to SOTA DFCIL methods for common class-incremental benchmarks.
arXiv Detail & Related papers (2021-06-17T17:56:08Z)
- A Survey on Deep Semi-supervised Learning [51.26862262550445]
We first present a taxonomy for deep semi-supervised learning that categorizes existing methods.
We then offer a detailed comparison of these methods in terms of the type of losses, contributions, and architecture differences.
arXiv Detail & Related papers (2021-02-28T16:22:58Z)
- Image Restoration by Deep Projected GSURE [115.57142046076164]
Ill-posed inverse problems appear in many image processing applications, such as deblurring and super-resolution.
We propose a new image restoration framework based on minimizing a loss function that includes a "projected version" of the Generalized Stein Unbiased Risk Estimator (GSURE) and a parameterization of the latent image by a CNN.
arXiv Detail & Related papers (2021-02-04T08:52:46Z)
- Deep Optimized Priors for 3D Shape Modeling and Reconstruction [38.79018852887249]
We introduce a new learning framework for 3D modeling and reconstruction.
We show that the proposed strategy effectively breaks the barriers constrained by the pre-trained priors.
arXiv Detail & Related papers (2020-12-14T03:56:31Z)
- Monocular Depth Estimation via Listwise Ranking using the Plackett-Luce Model [15.472533971305367]
In many real-world applications, the relative depth of objects in an image is crucial for scene understanding.
Recent approaches mainly tackle depth prediction in monocular images by treating it as a regression task.
Yet, ranking methods suggest themselves as a natural alternative to regression, and indeed, ranking approaches leveraging pairwise comparisons have shown promising performance on this problem.
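For reference, the Plackett-Luce model assigns a ranking the probability prod_i exp(s_i) / sum_{j>=i} exp(s_j), where the scores s_i are listed in ground-truth order. A hedged sketch of the resulting listwise loss (illustrative, not the paper's code):

```python
import torch

def plackett_luce_nll(scores_in_true_order):
    # Negative log-likelihood of the ground-truth ordering under the
    # Plackett-Luce model: sum_i (logsumexp_{j>=i} s_j - s_i).
    rev = torch.flip(scores_in_true_order, dims=[0])
    log_denoms = torch.flip(torch.logcumsumexp(rev, dim=0), dims=[0])
    return (log_denoms - scores_in_true_order).sum()
```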
arXiv Detail & Related papers (2020-10-25T13:40:10Z)
- Blind Image Restoration with Flow Based Priors [19.190289348734215]
In a blind setting with unknown degradations, a good prior remains crucial.
We propose using normalizing flows to model the distribution of the target content and to use this as a prior in a maximum a posteriori (MAP) formulation.
To the best of our knowledge, this is the first work that explores normalizing flows as prior in image enhancement problems.
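A hedged sketch of that MAP formulation, assuming a hypothetical `flow` that returns latents together with the log-determinant of its Jacobian, and a known `degrade` operator; names and hyperparameters are illustrative.

```python
import torch

def map_restore(y, degrade, flow, x0, sigma=0.05, steps=500, lr=1e-2):
    # Minimize the negative log-posterior: Gaussian data fit plus the
    # flow-based log-prior log p(x) = log N(z; 0, I) + log|det J_flow(x)|.
    x = x0.clone().requires_grad_(True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        z, logdet = flow(x)                        # hypothetical flow API
        log_prior = -0.5 * z.pow(2).sum() + logdet
        data_fit = ((degrade(x) - y) ** 2).sum() / (2 * sigma ** 2)
        (data_fit - log_prior).backward()
        opt.step()
    return x.detach()
```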
arXiv Detail & Related papers (2020-09-09T21:40:11Z)
- Continual Deep Learning by Functional Regularisation of Memorable Past [95.97578574330934]
Continually learning new skills is important for intelligent systems, yet standard deep learning methods suffer from catastrophic forgetting of the past.
We propose a new functional-regularisation approach that utilises a few memorable past examples that are crucial for avoiding forgetting.
Our method achieves state-of-the-art performance on standard benchmarks and opens a new direction for life-long learning where regularisation and memory-based methods are naturally combined.
arXiv Detail & Related papers (2020-04-29T10:47:54Z)
- Learning regularization and intensity-gradient-based fidelity for single image super resolution [0.0]
We study the image degradation process and establish degradation models in both intensity and gradient space.
A comprehensive data consistency constraint is established for the reconstruction.
The proposed fidelity term and the designed regularization term are embedded into a regularization framework.
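A minimal sketch of such a combined data-consistency term, penalizing mismatch in both intensity and gradient space; the `degrade` operator and the weighting are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def spatial_grads(img):
    # Finite-difference image gradients in both directions.
    gx = img[..., :, 1:] - img[..., :, :-1]
    gy = img[..., 1:, :] - img[..., :-1, :]
    return gx, gy

def fidelity(x, y, degrade, w=0.5):
    d = degrade(x)                              # e.g., blur + downsampling
    intensity = ((d - y) ** 2).mean()           # intensity-space fit
    gx_d, gy_d = spatial_grads(d)
    gx_y, gy_y = spatial_grads(y)
    gradient = ((gx_d - gx_y) ** 2).mean() + ((gy_d - gy_y) ** 2).mean()
    return intensity + w * gradient             # combined data consistency
```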
arXiv Detail & Related papers (2020-03-24T07:03:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.