Neural KEM: A Kernel Method with Deep Coefficient Prior for PET Image
Reconstruction
- URL: http://arxiv.org/abs/2201.01443v1
- Date: Wed, 5 Jan 2022 04:12:38 GMT
- Title: Neural KEM: A Kernel Method with Deep Coefficient Prior for PET Image
Reconstruction
- Authors: Siqi Li, Kuang Gong, Ramsey D. Badawi, Edward J. Kim, Jinyi Qi, and
Guobao Wang
- Abstract summary: We propose an implicit regularization for the kernel method by using a deep coefficient prior.
To solve the maximum-likelihood neural network-based reconstruction problem, we apply the principle of optimization transfer to derive a neural KEM algorithm.
The results from computer simulations and real patient data have demonstrated that the neural KEM can outperform existing KEM and deep image prior methods.
- Score: 10.619539066260154
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Image reconstruction of low-count positron emission tomography (PET) data is
challenging. Kernel methods address the challenge by incorporating image prior
information in the forward model of iterative PET image reconstruction. The
kernelized expectation-maximization (KEM) algorithm has been developed and
demonstrated to be effective and easy to implement. A common approach for a
further improvement of the kernel method would be adding an explicit
regularization, which however leads to a complex optimization problem. In this
paper, we propose an implicit regularization for the kernel method by using a
deep coefficient prior, which represents the kernel coefficient image in the
PET forward model using a convolutional neural network. To solve the
maximum-likelihood neural network-based reconstruction problem, we apply the
principle of optimization transfer to derive a neural KEM algorithm. Each
iteration of the algorithm consists of two separate steps: a KEM step for image
update from the projection data and a deep-learning step in the image domain
for updating the kernel coefficient image using the neural network. This
optimization algorithm is guaranteed to monotonically increase the data
likelihood. The results from computer simulations and real patient data have
demonstrated that the neural KEM can outperform existing KEM and deep image
prior methods.
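The alternating structure described in the abstract (a KEM step driven by the projection data, followed by an image-domain step that re-fits the kernel coefficient image) can be sketched on a toy problem. This is a minimal illustration, not the authors' implementation: the sizes, system matrix, and kernel matrix are hypothetical, and a simple smoothing operator stands in for the CNN-based deep coefficient prior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes: n pixels, m projection bins (hypothetical stand-ins for a PET geometry)
n, m = 16, 32
P = rng.random((m, n))                    # system (projection) matrix
K = np.eye(n) * 0.8 + 0.2 / n             # simple kernel matrix with positive entries
x_true = rng.random(n) + 0.1
y = rng.poisson(P @ x_true * 50) / 50.0   # noisy (Poisson) projection data

A = P @ K                                 # kernelized forward model: y ~ A @ alpha
alpha = np.ones(n)                        # kernel coefficient image
sens = A.sum(axis=0)                      # sensitivity image (column sums of A)

def kem_step(alpha):
    """One kernelized EM (KEM) update of the coefficient image."""
    ratio = y / np.maximum(A @ alpha, 1e-12)
    return alpha / sens * (A.T @ ratio)

def coefficient_prior_step(alpha):
    """Stand-in for the deep-learning step: the paper fits a CNN to the
    intermediate coefficient image; here a fixed smoothing operator plays
    that role purely for illustration."""
    S = np.eye(n) * 0.9 + 0.1 / n
    return S @ alpha

for _ in range(50):
    alpha = kem_step(alpha)               # KEM step: update from projection data
    alpha = coefficient_prior_step(alpha) # image-domain step: re-fit the prior

x_rec = K @ alpha                         # final reconstructed image
```

Note that the paper's monotone-likelihood guarantee applies to the optimization-transfer derivation with an actual network fit; the smoothing stand-in above only mimics the shape of the iteration.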
Related papers
- An Optimization-based Deep Equilibrium Model for Hyperspectral Image
Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
arXiv Detail & Related papers (2023-06-10T08:25:16Z) - Optimization-Based Deep Learning Methods for Magnetic Resonance Imaging
Reconstruction and Synthesis [0.0]
This dissertation aims to provide advanced nonsmooth variational models for Magnetic Resonance Imaging (MRI) reconstruction, efficient learnable image reconstruction algorithms, and deep learning methods for MRI reconstruction and synthesis.
The first part introduces a novel deep neural network whose architecture is inspired by proximal gradient descent for a variational model.
The second part substantially extends the preliminary work of the first part by solving the calibration-free fast pMRI reconstruction problem in a discrete-time optimal control framework.
The third part aims at developing a generalizable MRI reconstruction method in the meta-learning framework.
arXiv Detail & Related papers (2023-03-02T18:59:44Z) - Enhanced Sharp-GAN For Histopathology Image Synthesis [63.845552349914186]
Histopathology image synthesis aims to address the data shortage issue in training deep learning approaches for accurate cancer detection.
We propose a novel approach that enhances the quality of synthetic images by using nuclei topology and contour regularization.
The proposed approach outperforms Sharp-GAN in all four image quality metrics on two datasets.
arXiv Detail & Related papers (2023-01-24T17:54:01Z) - Towards Theoretically Inspired Neural Initialization Optimization [66.04735385415427]
We propose a differentiable quantity, named GradCosine, with theoretical insights to evaluate the initial state of a neural network.
We show that both the training and test performance of a network can be improved by maximizing GradCosine under norm constraint.
Generalized from the sample-wise analysis into the real batch setting, the resulting Neural Initialization Optimization (NIO) algorithm is able to automatically find a better initialization with negligible cost.
arXiv Detail & Related papers (2022-10-12T06:49:16Z) - Learning Optimal K-space Acquisition and Reconstruction using
Physics-Informed Neural Networks [46.751292014516025]
Deep neural networks have been applied to reconstruct undersampled k-space data and have shown improved reconstruction performance.
This work proposes a novel framework to learn k-space sampling trajectories by considering it as an Ordinary Differential Equation (ODE) problem.
Experiments were conducted on different in-vivo datasets (e.g., brain and knee images) acquired with different sequences.
arXiv Detail & Related papers (2022-04-05T20:28:42Z) - Image reconstruction algorithms in radio interferometry: from
handcrafted to learned denoisers [7.1439425093981574]
We introduce a new class of iterative image reconstruction algorithms for radio interferometry, inspired by plug-and-play methods.
The approach consists in learning a prior image model by training a deep neural network (DNN) as a denoiser.
We plug the learned denoiser into the forward-backward optimization algorithm, resulting in a simple iterative structure alternating a denoising step with a gradient-descent data-fidelity step.
arXiv Detail & Related papers (2022-02-25T20:26:33Z) - Deep Kernel Representation for Image Reconstruction in PET [9.041102353158065]
A deep kernel method is proposed by exploiting deep neural networks to enable an automated learning of an optimized kernel model.
The results from computer simulations and a real patient dataset demonstrate that the proposed deep kernel method can outperform existing kernel method and neural network method for dynamic PET image reconstruction.
arXiv Detail & Related papers (2021-10-04T03:53:33Z) - NerfingMVS: Guided Optimization of Neural Radiance Fields for Indoor
Multi-view Stereo [97.07453889070574]
We present a new multi-view depth estimation method that utilizes both conventional SfM reconstruction and learning-based priors.
We show that our proposed framework significantly outperforms state-of-the-art methods on indoor scenes.
arXiv Detail & Related papers (2021-09-02T17:54:31Z) - Direct PET Image Reconstruction Incorporating Deep Image Prior and a
Forward Projection Model [0.0]
Convolutional neural networks (CNNs) have recently achieved remarkable performance in positron emission tomography (PET) image reconstruction.
We propose an unsupervised direct PET image reconstruction method that incorporates a deep image prior framework.
Our proposed method incorporates a forward projection model with a loss function to achieve unsupervised direct PET image reconstruction from sinograms.
arXiv Detail & Related papers (2021-09-02T08:07:58Z) - PET Image Reconstruction with Multiple Kernels and Multiple Kernel Space
Regularizers [3.968853026164666]
We present a regularized kernelized MLEM with multiple kernel matrices and multiple kernel space regularizers that can be tailored for different applications.
New algorithms are derived using the technical tools of multi-kernel combination in machine learning, image dictionary learning in sparse coding, and graph Laplacian in graph signal processing.
arXiv Detail & Related papers (2021-03-04T03:28:17Z) - MSE-Optimal Neural Network Initialization via Layer Fusion [68.72356718879428]
Deep neural networks achieve state-of-the-art performance for a range of classification and inference tasks.
The use of gradient descent combined with nonconvexity renders learning susceptible to novel problems.
We propose fusing neighboring layers of deeper networks that are trained with random variables.
arXiv Detail & Related papers (2020-01-28T18:25:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.