To be or not to be stable, that is the question: understanding neural
networks for inverse problems
- URL: http://arxiv.org/abs/2211.13692v3
- Date: Wed, 7 Feb 2024 06:45:30 GMT
- Title: To be or not to be stable, that is the question: understanding neural
networks for inverse problems
- Authors: Davide Evangelista, James Nagy, Elena Morotti, Elena Loli Piccolomini
- Abstract summary: In this paper, we theoretically analyze the trade-off between stability and accuracy of neural networks.
We propose different supervised and unsupervised solutions to increase the network stability and maintain a good accuracy.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The solution of linear inverse problems arising, for example, in signal and
image processing is challenging, since ill-conditioning amplifies the noise
present in the data. Recently introduced deep learning algorithms outperform
traditional model-based approaches in performance, but they typically suffer
from instability with respect to data
perturbation. In this paper, we theoretically analyze the trade-off between
stability and accuracy of neural networks, when used to solve linear imaging
inverse problems in the non-underdetermined case. Moreover, we propose several
supervised and unsupervised solutions to increase network stability while
maintaining good accuracy, by means of regularization properties inherited from
a model-based iterative scheme during network training and a pre-processing
stabilizing operator in the neural networks. Extensive numerical experiments on
image deblurring confirm the theoretical results and the effectiveness of the
proposed deep learning-based approaches to handle noise on the data.
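To illustrate why ill-conditioning makes these problems challenging, the following minimal numpy sketch (an illustration, not the paper's method) shows a tiny data perturbation being amplified by naive inversion, and a classical Tikhonov-regularized solve trading a little accuracy for stability:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Build a matrix with rapidly decaying singular values, i.e. a large
# condition number (~1e8), mimicking an ill-conditioned imaging operator.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 10.0 ** np.linspace(0, -8, n)
A = U @ np.diag(s) @ V.T

x_true = rng.standard_normal(n)
y_noisy = A @ x_true + 1e-6 * rng.standard_normal(n)  # tiny data noise

# Naive inversion: the 1e-6 perturbation is amplified by up to 1/s_min = 1e8.
x_naive = np.linalg.solve(A, y_noisy)

# Tikhonov regularization: argmin ||A x - y||^2 + lam * ||x||^2.
lam = 1e-6
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y_noisy)

err_naive = np.linalg.norm(x_naive - x_true) / np.linalg.norm(x_true)
err_reg = np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true)
```

Here the naive solution is dominated by amplified noise (relative error far above 1), while the regularized one stays bounded; model-based and learned reconstruction methods alike must manage exactly this stability/accuracy trade-off.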
Related papers
- The Unreasonable Effectiveness of Solving Inverse Problems with Neural Networks [24.766470360665647]
We show that neural networks trained to learn solutions to inverse problems can find better solutions than classical methods, even on their training set.
Our findings suggest an alternative use for neural networks: rather than generalizing to new data for fast inference, they can also be used to find better solutions on known data.
arXiv Detail & Related papers (2024-08-15T12:38:10Z)
- Ambiguity in solving imaging inverse problems with deep learning based operators [0.0]
Large convolutional neural networks have been widely used as tools for image deblurring.
Image deblurring is mathematically modeled as an ill-posed inverse problem and its solution is difficult to approximate when noise affects the data.
In this paper, we propose some strategies to improve stability without losing too much accuracy when deblurring images with deep learning-based methods.
arXiv Detail & Related papers (2023-05-31T12:07:08Z)
- Cycle Consistency-based Uncertainty Quantification of Neural Networks in Inverse Imaging Problems [10.992084413881592]
Uncertainty estimation is critical for numerous applications of deep neural networks.
We show an uncertainty quantification approach for deep neural networks used in inverse problems based on cycle consistency.
arXiv Detail & Related papers (2023-05-22T09:23:18Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
However, PINNs can suffer training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
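The stability gain from implicit updates can be seen on a toy problem (a generic sketch, not the paper's ISGD algorithm): on a stiff quadratic loss, the explicit gradient step diverges at a step size that the implicit step tolerates.

```python
import numpy as np

# Toy comparison: for f(x) = 0.5 x^T H x, the explicit step
#   x <- x - eta * H x
# diverges when eta > 2 / lambda_max(H), while the implicit step
#   x <- x - eta * H x_next  =>  x_next = (I + eta H)^{-1} x
# remains stable for any eta > 0.
H = np.diag([1.0, 1000.0])   # multi-scale curvature, lambda_max = 1000
eta = 0.01                   # exceeds 2/1000, so the explicit iterate blows up
x_exp = np.array([1.0, 1.0])
x_imp = np.array([1.0, 1.0])

for _ in range(100):
    x_exp = x_exp - eta * H @ x_exp                      # explicit update
    x_imp = np.linalg.solve(np.eye(2) + eta * H, x_imp)  # implicit update
```

After 100 iterations the explicit iterate has grown without bound along the stiff direction, while the implicit iterate has contracted toward the minimizer.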
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
arXiv Detail & Related papers (2023-02-05T01:39:21Z)
- Deep Equilibrium Assisted Block Sparse Coding of Inter-dependent Signals: Application to Hyperspectral Imaging [71.57324258813675]
A dataset of inter-dependent signals is defined as a matrix whose columns demonstrate strong dependencies.
A neural network is employed to act as structure prior and reveal the underlying signal interdependencies.
Deep unrolling and deep equilibrium-based algorithms are developed, forming highly interpretable and concise deep learning-based architectures.
arXiv Detail & Related papers (2022-03-29T21:00:39Z)
- Stable, accurate and efficient deep neural networks for inverse problems with analysis-sparse models [2.969705152497174]
We present a novel construction of an accurate, stable and efficient neural network for inverse problems with general analysis-sparse models.
To construct the network, we unroll NESTA, an accelerated first-order method for convex optimization.
A restart scheme is employed to enable exponential decay of the required network depth, yielding a shallower, and consequently more efficient, network.
arXiv Detail & Related papers (2022-03-02T00:44:25Z)
- Imbedding Deep Neural Networks [0.0]
Continuous depth neural networks, such as Neural ODEs, have refashioned the understanding of residual neural networks in terms of non-linear vector-valued optimal control problems.
We propose a new approach which explicates the network's depth as a fundamental variable, thus reducing the problem to a system of forward-facing initial value problems.
arXiv Detail & Related papers (2022-01-31T22:00:41Z)
- Robust lEarned Shrinkage-Thresholding (REST): Robust unrolling for sparse recovery [87.28082715343896]
We consider deep neural networks for solving inverse problems that are robust to forward model mis-specifications.
We design a new robust deep neural network architecture by applying algorithm unfolding techniques to a robust version of the underlying recovery problem.
The proposed REST network is shown to outperform state-of-the-art model-based and data-driven algorithms in both compressive sensing and radar imaging problems.
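Algorithm unfolding of this kind can be sketched generically (an assumed illustration, not the REST architecture itself): each network layer mimics one iteration of ISTA for sparse recovery, and in a learned unrolling the matrices and thresholds would become trainable weights.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def unrolled_ista(A, y, n_layers=20, lam=0.1):
    # Each "layer" performs one ISTA iteration for the lasso problem
    #   min_x 0.5 * ||A x - y||^2 + lam * ||x||_1.
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient
    W = A.T / L                     # would be a learnable weight matrix
    x = np.zeros(A.shape[1])
    for _ in range(n_layers):
        x = soft_threshold(x + W @ (y - A @ x), lam / L)
    return x
```

With the step size 1/L, each layer is guaranteed not to increase the lasso objective, which is what gives unrolled architectures their interpretability and inherited convergence properties.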
arXiv Detail & Related papers (2021-10-20T06:15:45Z)
- Multivariate Deep Evidential Regression [77.34726150561087]
A new approach with uncertainty-aware neural networks shows promise over traditional deterministic methods.
We discuss three issues with a proposed solution to extract aleatoric and epistemic uncertainties from regression-based neural networks.
arXiv Detail & Related papers (2021-04-13T12:20:18Z)
- Non-Singular Adversarial Robustness of Neural Networks [58.731070632586594]
Adversarial robustness has become an emerging challenge for neural networks owing to their over-sensitivity to small input perturbations.
We formalize the notion of non-singular adversarial robustness for neural networks through the lens of joint perturbations to data inputs as well as model weights.
arXiv Detail & Related papers (2021-02-23T20:59:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.