A dual semismooth Newton based augmented Lagrangian method for
large-scale linearly constrained sparse group square-root Lasso problems
- URL: http://arxiv.org/abs/2111.13878v1
- Date: Sat, 27 Nov 2021 12:20:43 GMT
- Title: A dual semismooth Newton based augmented Lagrangian method for
large-scale linearly constrained sparse group square-root Lasso problems
- Authors: Chengjing Wang and Peipei Tang
- Abstract summary: This paper focuses on the numerical computation of large-scale linearly constrained sparse group square-root Lasso problems.
We propose a dual semismooth Newton (SSN) based augmented Lagrangian method (ALM) for it.
Numerical experiments demonstrate the efficiency of the proposed algorithm.
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Square-root Lasso problems are known to be robust regression problems.
Furthermore, square-root regression problems with structured sparsity also
play an important role in statistics and machine learning. In this paper, we
focus on the numerical computation of large-scale linearly constrained sparse
group square-root Lasso problems. To overcome the difficulty that the
objective function contains two nonsmooth terms, we propose a dual
semismooth Newton (SSN) based augmented Lagrangian method (ALM); that is, we
apply the ALM to the dual problem, with the subproblem solved by the SSN
method. Since the positive definiteness of the generalized Jacobian is crucial
for applying the SSN method, we characterize the equivalence between its
positive definiteness and the constraint nondegeneracy condition of the
corresponding primal problem. In the numerical implementation, we fully exploit
the second-order sparsity so that the Newton direction can be obtained efficiently.
Numerical experiments demonstrate the efficiency of the proposed algorithm.
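The computational core of such an SSN-based ALM is the proximal mapping of the sparse group penalty, since the semismooth Newton direction is built from a generalized Jacobian of this mapping. Below is a minimal NumPy sketch of the proximal mapping of $\lambda_1\|x\|_1 + \lambda_2\sum_g \|x_g\|_2$, offered as background rather than as the authors' implementation; the function names, the `groups` index structure, and the weights `lam1`/`lam2` are illustrative assumptions. For this particular penalty the prox is known to decompose into elementwise soft-thresholding followed by blockwise group shrinkage.

```python
import numpy as np

def soft_threshold(v, tau):
    """Elementwise soft-thresholding: the prox of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_sparse_group(v, groups, lam1, lam2):
    """Prox of x -> lam1*||x||_1 + lam2 * sum_g ||x_g||_2 at the point v.

    For this penalty the prox decomposes: elementwise soft-thresholding
    first, then blockwise group shrinkage. `groups` is a list of integer
    index arrays partitioning {0, ..., n-1}.
    """
    x = soft_threshold(np.asarray(v, dtype=float), lam1)
    for g in groups:
        norm_g = np.linalg.norm(x[g])
        # Shrink the whole block toward zero; kill it if its norm <= lam2.
        x[g] = 0.0 if norm_g <= lam2 else (1.0 - lam2 / norm_g) * x[g]
    return x

# Example: two groups of three coordinates each.
v = np.array([3.0, -0.2, 1.5, 0.1, -0.05, 0.08])
groups = [np.arange(0, 3), np.arange(3, 6)]
print(prox_sparse_group(v, groups, lam1=0.1, lam2=0.5))
```

Because the mapping has this simple closed form and zeroes out entire groups, its generalized Jacobian is sparse and block-structured, which is consistent with the second-order sparsity the abstract refers to.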
Related papers
- Trust-Region Sequential Quadratic Programming for Stochastic Optimization with Random Models [57.52124921268249]
We propose a trust-region sequential quadratic programming method to find both first- and second-order stationary points.
To converge to first-order stationary points, our method computes a gradient step in each iteration, defined by minimizing a quadratic approximation of the objective subject to a trust-region constraint.
To converge to second-order stationary points, our method additionally computes an eigen step to explore the negative curvature of the reduced Hessian matrix (a minimal illustration follows this entry).
arXiv Detail & Related papers (2024-09-24T04:39:47Z)
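As a minimal illustration of the eigen step mentioned above, and not the paper's stochastic trust-region variant, the sketch below extracts a unit direction of negative curvature from a symmetric (reduced) Hessian via a dense eigendecomposition; the function name `eigen_step` and the tolerance are illustrative assumptions.

```python
import numpy as np

def eigen_step(H, tol=1e-8):
    """Return a unit negative-curvature direction of a symmetric matrix H,
    or None when the smallest eigenvalue is (numerically) nonnegative."""
    eigvals, eigvecs = np.linalg.eigh(H)   # eigenvalues in ascending order
    if eigvals[0] < -tol:
        return eigvecs[:, 0]               # d with d^T H d = eigvals[0] < 0
    return None

# Example: an indefinite 2x2 Hessian.
H = np.array([[1.0, 2.0],
              [2.0, -3.0]])
d = eigen_step(H)
print(d, d @ H @ d)  # curvature along d is negative
```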
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we study a robust phase retrieval problem where the task is to recover an unknown signal from magnitude-only measurements in the presence of outliers.
Our proposed method avoids the need for computationally expensive spectral initialization, using simple gradient steps that are robust to outliers.
arXiv Detail & Related papers (2024-09-07T06:37:23Z)
- Constrained Optimization via Exact Augmented Lagrangian and Randomized Iterative Sketching [55.28394191394675]
We develop an adaptive inexact Newton method for equality-constrained nonlinear, nonconvex optimization problems.
We demonstrate the superior performance of our method on benchmark nonlinear problems, constrained logistic regression with data from LIBSVM, and a PDE-constrained problem.
arXiv Detail & Related papers (2023-05-28T06:33:37Z)
- Stochastic Inexact Augmented Lagrangian Method for Nonconvex Expectation Constrained Optimization [88.0031283949404]
Many real-world problems have complicated nonconvex functional constraints and use a large number of data points.
Our proposed method outperforms an existing method that achieved the previously best-known result.
arXiv Detail & Related papers (2022-12-19T14:48:54Z)
- Symmetrized Robust Procrustes: Constant-Factor Approximation and Exact Recovery [6.157087562708549]
We propose a novel convex relaxation for the Robust Procrustes problem.
We find that our method performs similarly to the standard iteratively reweighted least squares (IRLS) algorithm.
arXiv Detail & Related papers (2022-07-18T13:32:20Z)
- Smooth Bilevel Programming for Sparse Regularization [5.177947445379688]
Iteratively reweighted least squares (IRLS) is a popular approach to solve sparsity-enforcing regression problems in machine learning.
We show how a surprisingly simple reparametrization of IRLS, coupled with a bilevel scheme, achieves top performance across a wide range of sparsity-enforcing regularizers (a sketch of the classic IRLS loop follows this entry).
arXiv Detail & Related papers (2021-06-02T19:18:22Z)
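For background only, here is a minimal NumPy sketch of the classic IRLS loop for the Lasso, based on the variational identity $|t| = \min_{w>0} (t^2/w + w)/2$; the paper's contribution is a reparametrized bilevel variant, not this textbook scheme, and the function name, the damping `eps`, and the iteration count are illustrative assumptions.

```python
import numpy as np

def irls_lasso(A, b, lam, n_iter=100, eps=1e-8):
    """Classic IRLS sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Alternates a weighted ridge solve with a closed-form weight update,
    using the bound |t| = min_{w>0} (t^2/w + w)/2, minimized at w = |t|.
    """
    n = A.shape[1]
    w = np.ones(n)
    for _ in range(n_iter):
        # Weighted ridge step: (A^T A + lam*diag(1/w)) x = A^T b.
        x = np.linalg.solve(A.T @ A + lam * np.diag(1.0 / w), A.T @ b)
        w = np.abs(x) + eps  # eps keeps the weights bounded away from zero
    return x

# Example: small random sparse-regression instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(np.round(irls_lasso(A, b, lam=0.5), 2))
```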
- A Stochastic Composite Augmented Lagrangian Method For Reinforcement Learning [9.204659134755795]
We consider the linear programming (LP) formulation for deep reinforcement learning.
The augmented Lagrangian method suffers from the double-sampling obstacle when solving the LP.
A deep parameterized augmented Lagrangian method is proposed.
arXiv Detail & Related papers (2021-05-20T13:08:06Z)
- Hybrid Trilinear and Bilinear Programming for Aligning Partially Overlapping Point Sets [85.71360365315128]
In many applications, we need algorithms that can align partially overlapping point sets and are invariant to the corresponding transformations, in the spirit of the robust point matching (RPM) algorithm.
We first show that the objective is a cubic polynomial function. We then utilize the convex envelopes of trilinear and bilinear monomials to derive its lower bound (see the envelope sketch after this entry).
We next develop a branch-and-bound (BnB) algorithm that only branches over the transformation variables and runs efficiently.
arXiv Detail & Related papers (2021-01-19T04:24:23Z)
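To make the convex envelopes of bilinear monomials concrete, here is a minimal Python sketch of the classical McCormick envelope of a single bilinear term $xy$ over a box; the trilinear envelopes used in the paper are analogous but more involved, and the function names are illustrative assumptions.

```python
def mccormick_lower(x, y, xl, xu, yl, yu):
    """Convex envelope (McCormick underestimator) of the bilinear
    monomial x*y over the box [xl, xu] x [yl, yu]."""
    return max(xl * y + x * yl - xl * yl,
               xu * y + x * yu - xu * yu)

def mccormick_upper(x, y, xl, xu, yl, yu):
    """Concave overestimator of x*y over the same box."""
    return min(xu * y + x * yl - xu * yl,
               xl * y + x * yu - xl * yu)

# Example: bound x*y for x in [0, 2], y in [-1, 3] at (x, y) = (1.0, 1.0).
print(mccormick_lower(1.0, 1.0, 0, 2, -1, 3),   # -1.0 <= x*y
      mccormick_upper(1.0, 1.0, 0, 2, -1, 3))   #  x*y <= 3.0
```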
- Matrix-wise $\ell_0$-constrained Sparse Nonnegative Least Squares [22.679160149512377]
Nonnegative least squares problems with multiple right-hand sides (MNNLS) arise in models that rely on additive linear combinations.
We introduce a novel formulation for sparse MNNLS with a matrix-wise sparsity constraint.
We show that our proposed two-step approach provides more accurate results than state-of-the-art sparse coding heuristics applied both column-wise and globally.
arXiv Detail & Related papers (2020-11-22T17:21:16Z)
- Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT); a sketch-and-solve illustration follows this entry.
arXiv Detail & Related papers (2020-06-10T15:00:09Z)
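For background, here is a minimal sketch-and-solve routine for $\ell_2$-regularized least squares using only the Gaussian embedding (an SRHT variant would replace the dense matrix `S` with a structured transform); the function name, sketch size `m_sketch`, and seed are illustrative assumptions, and the paper's adaptive, effective-dimension-based choice of sketch size is not modeled.

```python
import numpy as np

def sketched_ridge(A, b, lam, m_sketch, seed=0):
    """Gaussian sketch-and-solve for min_x ||Ax - b||^2 + lam*||x||^2.

    Replaces (A, b) by a smaller sketched pair (SA, Sb) and solves the
    resulting regularized normal equations exactly.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    S = rng.standard_normal((m_sketch, m)) / np.sqrt(m_sketch)
    SA, Sb = S @ A, S @ b
    return np.linalg.solve(SA.T @ SA + lam * np.eye(n), SA.T @ Sb)

# Example: sketch 2000 rows down to 200.
rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 50))
b = rng.standard_normal(2000)
x = sketched_ridge(A, b, lam=1.0, m_sketch=200)
print(x.shape)
```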
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences arising from its use.