Sparse Signal Reconstruction for Nonlinear Models via Piecewise Rational
Optimization
- URL: http://arxiv.org/abs/2010.15427v2
- Date: Wed, 25 Nov 2020 16:20:22 GMT
- Title: Sparse Signal Reconstruction for Nonlinear Models via Piecewise Rational
Optimization
- Authors: Arthur Marmin and Marc Castella and Jean-Christophe Pesquet and
Laurent Duval
- Abstract summary: We propose a method to reconstruct sparse signals degraded by a nonlinear distortion and acquired at a limited sampling rate.
Our method formulates the reconstruction as a nonconvex minimization of the sum of a data-fitting term and a penalization term.
It is shown how the structure of the problem can be exploited to lighten the computational burden, and numerical simulations illustrate the benefits of the method.
- Score: 27.080837460030583
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a method to reconstruct sparse signals degraded by a nonlinear
distortion and acquired at a limited sampling rate. Our method formulates the
reconstruction problem as a nonconvex minimization of the sum of a data fitting
term and a penalization term. In contrast with most previous works, which settle
for approximate local solutions, we seek a global solution to the resulting
challenging nonconvex problem. Our global approach relies on the so-called
Lasserre relaxation of polynomial optimization. We here specifically include in
our approach the case of piecewise rational functions, which makes it possible
to address a wide class of nonconvex exact and continuous relaxations of the
$\ell_0$ penalization function. Additionally, we study the complexity of the
optimization problem. It is shown how to use the structure of the problem to
lighten the computational burden efficiently. Finally, numerical simulations
illustrate the benefits of our method in terms of both global optimality and
signal reconstruction.
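As an illustration of the kind of problem the abstract describes, the sketch below sets up a nonconvex objective combining a data-fitting term through a nonlinear distortion with a rational, continuous surrogate of the $\ell_0$ penalty. The measurement model, the specific distortion, and the surrogate penalty here are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

# Hypothetical setup: recover a sparse signal x from y ≈ D(A x), where D is
# a known nonlinear distortion and A an undersampled measurement matrix.

rng = np.random.default_rng(0)
n, m = 20, 8                      # signal length, number of samples (m < n)
A = rng.standard_normal((m, n))   # undersampled linear acquisition

def distortion(t):
    # Example nonlinear distortion (saturation); the paper addresses a wide
    # class of piecewise rational nonlinearities.
    return t / (1.0 + np.abs(t))

def ell0_relaxation(x, delta=0.1):
    # Continuous rational surrogate of the l0 penalty: each term is 0 at
    # x_i = 0 and tends to 1 as |x_i| grows, mimicking the counting norm.
    return np.sum(x**2 / (x**2 + delta))

def objective(x, y, lam=0.5):
    # Nonconvex objective: data-fitting term + sparsity-promoting penalty.
    residual = distortion(A @ x) - y
    return 0.5 * np.sum(residual**2) + lam * ell0_relaxation(x)

# Ground-truth sparse signal and its (noiseless) nonlinear measurements.
x_true = np.zeros(n)
x_true[[2, 11]] = [1.5, -2.0]
y = distortion(A @ x_true)

print(objective(x_true, y))  # penalty-only cost at the true signal
```

Because the surrogate penalty is rational, the whole objective fits the piecewise rational framework that makes the Lasserre relaxation applicable.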
Related papers
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we present a robust phase retrieval problem where the task is to recover an unknown signal from corrupted measurements.
Our proposed algorithm avoids the need for a computationally expensive spectral initialization, using a simple gradient step while remaining robust to outliers.
arXiv Detail & Related papers (2024-09-07T06:37:23Z) - Alternating Minimization Schemes for Computing Rate-Distortion-Perception Functions with $f$-Divergence Perception Constraints [10.564071872770146]
We study the computation of the rate-distortion-perception function (RDPF) for discrete memoryless sources.
We characterize the optimal parametric solutions.
We provide sufficient conditions on the distortion and the perception constraints.
arXiv Detail & Related papers (2024-08-27T12:50:12Z) - Optimal Guarantees for Algorithmic Reproducibility and Gradient
Complexity in Convex Optimization [55.115992622028685]
Previous work suggests that first-order methods would need to trade off convergence rate for better reproducibility.
We demonstrate that both optimal complexity and near-optimal convergence guarantees can be achieved for smooth convex minimization and smooth convex-concave minimax problems.
arXiv Detail & Related papers (2023-10-26T19:56:52Z) - Optimal Sets and Solution Paths of ReLU Networks [56.40911684005949]
We develop an analytical framework to characterize the set of optimal ReLU networks.
We establish conditions for the solution set of ReLU networks to be continuous, and develop sensitivity results for ReLU networks.
arXiv Detail & Related papers (2023-05-31T18:48:16Z) - Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling its differentiable and non-differentiable parts separately, linearizing only the smooth components.
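The idea of linearizing only the smooth part can be sketched with a proximal-gradient step, which handles the nonsmooth term exactly through its proximal map. This is an illustrative stand-in under that general principle, not the cited paper's algorithm.

```python
import numpy as np

# Proximal gradient on F(x) = f(x) + g(x): the smooth f(x) = 0.5*||x - b||^2
# is linearized at each step, while the nonsmooth g(x) = lam*||x||_1 is kept
# exact via its proximal map (soft-thresholding). Data and step size are
# illustrative.

b = np.array([3.0, -0.2, 0.05, -4.0])
lam, step = 1.0, 1.0

def grad_f(x):
    return x - b  # gradient of the smooth quadratic part

def prox_g(v, t):
    # Exact proximal map of t * lam * ||.||_1: soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

x = np.zeros_like(b)
for _ in range(50):
    x = prox_g(x - step * grad_f(x), step)

print(x)  # small entries of b are shrunk exactly to zero
```

Treating the nonsmooth part exactly, rather than smoothing it, is what lets the iterates land exactly on points of nondifferentiability (here, exact zeros).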
arXiv Detail & Related papers (2023-02-24T18:41:48Z) - Convex mixed-integer optimization with Frank-Wolfe methods [20.37026309402396]
Mixed-integer nonlinear optimization presents both theoretical and computational challenges.
We propose a new type of method to solve these problems based on a branch-and-bound algorithm with convex node relaxations.
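The convex relaxation at each node is the kind of problem a Frank-Wolfe method handles well, since it only needs a linear minimization oracle over the feasible set. A minimal Frank-Wolfe sketch over the probability simplex (illustrative data, not the paper's branch-and-bound implementation):

```python
import numpy as np

# Frank-Wolfe: minimize a smooth convex loss over the probability simplex by
# repeatedly calling a linear minimization oracle (pick the best vertex) and
# stepping toward it. No projection is ever required.

b = np.array([0.1, 0.7, 0.2])  # target point inside the simplex

def grad(x):
    return x - b  # gradient of f(x) = 0.5 * ||x - b||^2

x = np.ones(3) / 3.0  # start at the barycenter of the simplex
for t in range(2000):
    s = np.zeros(3)
    s[np.argmin(grad(x))] = 1.0  # LMO over the simplex: a single vertex
    gamma = 2.0 / (t + 2.0)      # standard diminishing step size
    x = (1.0 - gamma) * x + gamma * s

print(x)  # close to b, the constrained minimizer
```

Because every iterate is a convex combination of vertices, feasibility is maintained for free, which is what makes Frank-Wolfe attractive inside node relaxations.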
arXiv Detail & Related papers (2022-08-23T14:46:54Z) - Efficient Algorithms for High-Dimensional Convex Subspace Optimization
via Strict Complementarity [19.24470467199451]
We consider optimization problems in which the goal is to find a $k$-dimensional subspace of $\mathbb{R}^n$ which minimizes a smooth convex loss.
While this problem is highly relevant in high-dimensional regimes, a globally optimal solution is difficult to find.
In this paper we present a natural relaxation for which convergence to the global optimum is straightforward.
arXiv Detail & Related papers (2022-02-08T17:36:43Z) - Optimal Rates for Random Order Online Optimization [60.011653053877126]
We study the random-order online optimization setting of Garber et al. (2020), where the loss functions may be chosen by an adversary but are then presented online in a uniformly random order.
We show that the algorithms of Garber et al. (2020) achieve the optimal bounds and significantly improve their stability.
arXiv Detail & Related papers (2021-06-29T09:48:46Z) - Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other problems in data science, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z) - Sparse recovery by reduced variance stochastic approximation [5.672132510411465]
We discuss the application of iterative quadratic optimization routines to the problem of sparse signal recovery from noisy observations.
We show how one can straightforwardly enhance the reliability of the corresponding solution by using Median-of-Means-like techniques.
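The Median-of-Means idea is easy to illustrate: split the samples into blocks, average each block, and take the median of the block means, so a single gross outlier can corrupt only one block. This sketches the general technique named in the abstract, not the authors' exact estimator.

```python
import numpy as np

# Median-of-Means estimator for the mean of a sample.
def median_of_means(samples, k):
    blocks = np.array_split(samples, k)        # k blocks of (near-)equal size
    return float(np.median([b.mean() for b in blocks]))

rng = np.random.default_rng(1)
data = rng.normal(loc=5.0, scale=1.0, size=1000)
data[0] = 1e6  # one gross outlier

print(np.mean(data))              # plain mean is dragged far from 5
print(median_of_means(data, 20))  # robust estimate stays near 5
```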
arXiv Detail & Related papers (2020-06-11T12:31:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated summaries (including all information) and is not responsible for any consequences of their use.