Principal Component Hierarchy for Sparse Quadratic Programs
- URL: http://arxiv.org/abs/2105.12022v1
- Date: Tue, 25 May 2021 15:45:16 GMT
- Title: Principal Component Hierarchy for Sparse Quadratic Programs
- Authors: Robbie Vreugdenhil, Viet Anh Nguyen, Armin Eftekhari, Peyman Mohajerin
Esfahani
- Abstract summary: We propose a novel approximation hierarchy for cardinality-constrained, convex quadratic programs.
We show that the proposed methods are competitive with existing screening methods in the sparse regression literature.
- Score: 27.812191030127618
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a novel approximation hierarchy for cardinality-constrained,
convex quadratic programs that exploits the rank-dominating eigenvectors of the
quadratic matrix. Each level of approximation admits a min-max characterization
whose objective function can be optimized over the binary variables
analytically, while preserving convexity in the continuous variables.
Exploiting this property, we propose two scalable optimization algorithms,
coined as the "best response" and the "dual program", that can efficiently
screen the potential indices of the nonzero elements of the original program.
We show that the proposed methods are competitive with existing screening
methods in the sparse regression literature and are particularly fast on
instances with a high number of measurements, in experiments with both
synthetic and real datasets.
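Below is a minimal numpy sketch of the general idea described in the abstract: approximate the quadratic matrix by its dominant eigenvectors and use the cheap surrogate to screen candidate nonzero indices. It assumes the standard sparse-regression form min_x 0.5*||Ax - b||^2 subject to ||x||_0 <= k (so Q = A'A and c = -A'b); the magnitude-based screening rule is an illustrative placeholder, not the paper's "best response" or "dual program" scheme.

```python
import numpy as np

def rank_r_surrogate(Q, r):
    """Rank-r approximation of the quadratic matrix from its dominant eigenpairs."""
    lam, U = np.linalg.eigh(Q)            # eigenvalues in ascending order
    top = np.argsort(lam)[::-1][:r]       # indices of the r largest eigenvalues
    return lam[top], U[:, top]

def screen_support(Q, c, k, r):
    """Illustrative screening: minimize the rank-r surrogate of 0.5*x'Qx + c'x
    without the cardinality constraint and keep the k largest-magnitude entries.
    (Placeholder rule; the paper's best-response / dual-program schemes differ.)"""
    lam, U = rank_r_surrogate(Q, r)
    x_relax = U @ ((U.T @ (-c)) / lam)    # minimizer restricted to the dominant subspace
    return np.argsort(np.abs(x_relax))[::-1][:k]

# Toy sparse-regression instance: min_x 0.5*||Ax - b||^2  s.t.  ||x||_0 <= k,
# i.e. Q = A'A and c = -A'b.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
x_true = np.zeros(50)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(200)
candidates = screen_support(A.T @ A, -(A.T @ b), k=5, r=20)
print(sorted(candidates.tolist()))
```

Here r loosely plays the role of the approximation level: keeping more eigenvectors tightens the surrogate at extra cost.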
Related papers
- Variable Substitution and Bilinear Programming for Aligning Partially Overlapping Point Sets [48.1015832267945]
This research presents a method that meets the requirements of partially overlapping point-set alignment through the minimization objective of the RPM algorithm.
A branch-and-bound (BnB) algorithm is devised that branches only over the parameters, thereby boosting the convergence rate.
Empirical evaluations demonstrate better robustness of the proposed methodology against non-rigid deformation, positional noise, and outliers, when compared with prevailing state-of-the-art approaches.
arXiv Detail & Related papers (2024-05-14T13:28:57Z) - Online estimation of the inverse of the Hessian for stochastic optimization with application to universal stochastic Newton algorithms [4.389938747401259]
This paper addresses second-order optimization for estimating the minimizer of a convex function written as an expectation.
A direct recursive estimation technique for the inverse Hessian matrix using a Robbins-Monro procedure is introduced.
Above all, it makes it possible to develop universal stochastic Newton methods and to investigate the efficiency of the proposed approach.
arXiv Detail & Related papers (2024-01-15T13:58:30Z) - Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and
Optimal Algorithms [64.10576998630981]
We show the first tight characterization of the optimal Hessian-dependent sample complexity.
A Hessian-independent algorithm universally achieves the optimal sample complexities for all Hessian instances.
The optimal sample complexities achieved by our algorithm remain valid for heavy-tailed noise distributions.
arXiv Detail & Related papers (2023-06-21T17:03:22Z) - Accelerated First-Order Optimization under Nonlinear Constraints [73.2273449996098]
We exploit connections between first-order algorithms for constrained optimization and non-smooth dynamical systems to design a new class of accelerated first-order algorithms.
An important property of these algorithms is that constraints are expressed in terms of velocities instead of positions.
arXiv Detail & Related papers (2023-02-01T08:50:48Z) - Efficient Global Planning in Large MDPs via Stochastic Primal-Dual
Optimization [12.411844611718958]
We show that our method outputs a near-optimal policy after a number of queries to the generative model.
Our method is computationally efficient and comes with the major advantage that it outputs a single softmax policy that is compactly represented by a low-dimensional parameter vector.
arXiv Detail & Related papers (2022-10-21T15:49:20Z) - Non-Convex Optimization with Certificates and Fast Rates Through Kernel
Sums of Squares [68.8204255655161]
We consider potentially non-convex optimization problems.
In this paper, we propose an algorithm that achieves close-to-optimal a priori computational guarantees.
arXiv Detail & Related papers (2022-04-11T09:37:04Z) - On the implementation of a global optimization method for mixed-variable
problems [0.30458514384586394]
The algorithm is based on the radial basis function method of Gutmann and the metric response surface method of Regis and Shoemaker.
We propose several modifications aimed at generalizing and improving these two algorithms.
arXiv Detail & Related papers (2020-09-04T13:36:56Z) - Effective Dimension Adaptive Sketching Methods for Faster Regularized
Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT); a generic sketch-and-solve illustration appears after this list.
arXiv Detail & Related papers (2020-06-10T15:00:09Z) - Optimal Randomized First-Order Methods for Least-Squares Problems [56.05635751529922]
This class of algorithms encompasses several randomized methods among the fastest solvers for least-squares problems.
We focus on two classical embeddings, namely, Gaussian projections and subsampled Hadamard transforms.
Our resulting algorithm yields the best complexity known for solving least-squares problems with no condition number dependence.
arXiv Detail & Related papers (2020-02-21T17:45:32Z) - Statistical Outlier Identification in Multi-robot Visual SLAM using
Expectation Maximization [18.259478519717426]
This paper introduces a novel and distributed method for detecting inter-map loop closure outliers in simultaneous localization and mapping (SLAM).
The proposed algorithm does not rely on a good initialization and can handle more than two maps at a time.
arXiv Detail & Related papers (2020-02-07T06:34:44Z)
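As a rough illustration of the sketching idea in the regularized least-squares entry above, here is a generic Gaussian sketch-and-solve for ridge regression. The embedding dimension m and the plain sketch-and-solve scheme are assumptions made for illustration, not the paper's effective-dimension-adaptive algorithm.

```python
import numpy as np

def gaussian_sketch_ridge(A, b, lam, m, rng):
    """Generic sketch-and-solve for min_x ||Ax - b||^2 + lam*||x||^2 using a
    Gaussian embedding S with m rows (m much smaller than the number of samples)."""
    n, d = A.shape
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian embedding
    SA, Sb = S @ A, S @ b                          # sketched problem
    return np.linalg.solve(SA.T @ SA + lam * np.eye(d), SA.T @ Sb)

rng = np.random.default_rng(0)
A = rng.standard_normal((5000, 100))
x_true = rng.standard_normal(100)
b = A @ x_true + 0.1 * rng.standard_normal(5000)
x_hat = gaussian_sketch_ridge(A, b, lam=1.0, m=500, rng=rng)
print(np.linalg.norm(A @ x_hat - b) / np.linalg.norm(b))   # relative residual
```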
This list is automatically generated from the titles and abstracts of the papers on this site.