OKRidge: Scalable Optimal k-Sparse Ridge Regression
- URL: http://arxiv.org/abs/2304.06686v3
- Date: Thu, 11 Jan 2024 20:27:54 GMT
- Title: OKRidge: Scalable Optimal k-Sparse Ridge Regression
- Authors: Jiachang Liu, Sam Rosen, Chudi Zhong, Cynthia Rudin
- Abstract summary: We propose a fast algorithm, OKRidge, for sparse ridge regression.
We also propose a method to warm-start our solver, which leverages a beam search.
- Score: 21.17964202317435
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider an important problem in scientific discovery, namely identifying
sparse governing equations for nonlinear dynamical systems. This involves
solving sparse ridge regression problems to provable optimality in order to
determine which terms drive the underlying dynamics. We propose a fast
algorithm, OKRidge, for sparse ridge regression, using a novel lower bound
calculation that starts from a saddle point formulation and then either (i)
solves a linear system or (ii) uses an ADMM-based approach, in which the
proximal operators can be evaluated efficiently by solving another linear
system and an isotonic regression problem. We also propose a method to
warm-start our solver, which leverages a beam search. Experimentally, our
methods attain provable optimality with run times that are orders of magnitude
faster than those of the existing MIP formulations solved by the commercial
solver Gurobi.
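Since the abstract only names the ingredients of the warm start, the following is a minimal sketch of what a beam search for a good size-k support could look like: supports are grown one feature at a time, and only the beam_width best supports (ranked by the ridge objective) survive each round. The function names, scoring rule, and expansion strategy are illustrative assumptions, not the authors' implementation; the returned support would then seed the exact solver.

```python
import numpy as np

def ridge_objective(X, y, support, lam):
    """Solve ridge regression restricted to `support`; return (objective, coefficients)."""
    Xs = X[:, support]
    w = np.linalg.solve(Xs.T @ Xs + lam * np.eye(len(support)), Xs.T @ y)
    r = y - Xs @ w
    return 0.5 * r @ r + 0.5 * lam * w @ w, w

def beam_search_warm_start(X, y, k, lam, beam_width=10):
    """Grow supports one feature at a time, keeping the beam_width best."""
    p = X.shape[1]
    beam = [((), np.inf)]  # (support, objective) pairs, best first
    for _ in range(k):
        scored = {}
        for support, _ in beam:
            for j in range(p):
                if j in support:
                    continue
                cand = tuple(sorted(support + (j,)))
                if cand not in scored:
                    scored[cand], _ = ridge_objective(X, y, list(cand), lam)
        beam = sorted(scored.items(), key=lambda item: item[1])[:beam_width]
    best_support, best_objective = beam[0]
    return list(best_support), best_objective
```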
Related papers
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we consider a robust phase retrieval problem in which the task is to recover an unknown signal.
Our proposed method avoids the need for a computationally expensive spectral initialization, instead using a simple gradient step while remaining robust to outliers.
arXiv Detail & Related papers (2024-09-07T06:37:23Z)
- An Optimization-based Deep Equilibrium Model for Hyperspectral Image Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
arXiv Detail & Related papers (2023-06-10T08:25:16Z)
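In the deep-equilibrium entry above, the iterative solver is run to a fixed point rather than unrolled for a fixed number of steps. Below is a minimal sketch of that fixed-point calculation, assuming a generic contractive update map f(z, x); the damping and tolerance values are illustrative, not from the paper.

```python
import numpy as np

def deq_fixed_point(f, x, z0, tol=1e-6, max_iter=500, damping=0.5):
    """Damped iteration z <- (1 - damping) * z + damping * f(z, x)."""
    z = z0
    for _ in range(max_iter):
        z_next = (1.0 - damping) * z + damping * f(z, x)
        if np.linalg.norm(z_next - z) <= tol * (1.0 + np.linalg.norm(z)):
            return z_next
        z = z_next
    return z  # last iterate if the tolerance was not reached

# Toy usage: the fixed point of f(z, x) = 0.5 * z + x is z* = 2 * x.
x = np.array([1.0, -2.0])
z_star = deq_fixed_point(lambda z, x: 0.5 * z + x, x, np.zeros_like(x))
```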
- Constrained Optimization via Exact Augmented Lagrangian and Randomized Iterative Sketching [55.28394191394675]
We develop an adaptive inexact Newton method for equality-constrained nonlinear, nonconvex optimization problems.
We demonstrate the superior performance of our method on benchmark nonlinear problems, constrained logistic regression with data from LIBSVM, and a PDE-constrained problem.
arXiv Detail & Related papers (2023-05-28T06:33:37Z)
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are made through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z)
- Learning Sparse Nonlinear Dynamics via Mixed-Integer Optimization [3.7565501074323224]
We propose an exact formulation of the SINDy problem using mixed-integer optimization (MIO) to solve the sparsity-constrained regression problem to provable optimality in seconds.
We illustrate the dramatic improvement our approach brings to accurate model discovery while being more sample efficient, robust to noise, and flexible in accommodating physical constraints.
arXiv Detail & Related papers (2022-06-01T01:43:45Z)
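To make the mixed-integer formulation in the entry above concrete, here is a minimal big-M sketch of a sparsity-constrained least-squares problem in CVXPY. The big-M value, the variable names, and the solver choice are illustrative assumptions; the paper's exact SINDy formulation may differ.

```python
import cvxpy as cp

def k_sparse_least_squares(Theta, dxdt, k, big_m=100.0):
    """Big-M MIO sketch: min ||dxdt - Theta @ xi||^2 s.t. ||xi||_0 <= k."""
    p = Theta.shape[1]
    xi = cp.Variable(p)               # coefficients of candidate library terms
    z = cp.Variable(p, boolean=True)  # z[j] = 1 iff xi[j] may be nonzero
    constraints = [xi <= big_m * z, xi >= -big_m * z, cp.sum(z) <= k]
    problem = cp.Problem(cp.Minimize(cp.sum_squares(dxdt - Theta @ xi)),
                         constraints)
    problem.solve(solver=cp.GUROBI)   # any MIQP-capable solver works
    return xi.value
```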
- Unsupervised Reservoir Computing for Solving Ordinary Differential Equations [1.6371837018687636]
We introduce unsupervised reservoir computing (RC), an echo-state recurrent neural network capable of discovering approximate solutions that satisfy ordinary differential equations (ODEs).
We use Bayesian optimization to efficiently discover optimal sets in a high-dimensional hyperparameter space and numerically show that one such set is robust and can be used to solve an ODE for different initial conditions and time ranges.
arXiv Detail & Related papers (2021-08-25T18:16:42Z)
- Sparse Bayesian Learning via Stepwise Regression [1.2691047660244335]
We propose a coordinate ascent algorithm for sparse Bayesian learning (SBL) termed Relevance Matching Pursuit (RMP).
As its noise variance parameter goes to zero, RMP exhibits a surprising connection to Stepwise Regression.
We derive novel guarantees for Stepwise Regression algorithms, which also shed light on RMP.
arXiv Detail & Related papers (2021-06-11T00:20:27Z)
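For context on the entry above, the stepwise regression that RMP connects to can be sketched in a few lines: greedily add the feature that most reduces the residual sum of squares, refitting at each step. This is a generic forward-stepwise sketch, not the RMP algorithm itself.

```python
import numpy as np

def forward_stepwise(X, y, k):
    """Greedily select k features, refitting least squares at each step."""
    n, p = X.shape
    support = []
    for _ in range(k):
        best_j, best_rss = None, np.inf
        for j in range(p):
            if j in support:
                continue
            Xs = X[:, support + [j]]
            w, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            rss = np.sum((y - Xs @ w) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        support.append(best_j)
    return support
```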
- Solving weakly supervised regression problem using low-rank manifold regularization [77.34726150561087]
We solve a weakly supervised regression problem.
Under "weakly" we understand that for some training points the labels are known, for some unknown, and for others uncertain due to the presence of random noise or other reasons such as lack of resources.
In the numerical section, we applied the suggested method to artificial and real datasets using Monte-Carlo modeling.
arXiv Detail & Related papers (2021-04-13T23:21:01Z)
- Combining Deep Learning and Optimization for Security-Constrained Optimal Power Flow [94.24763814458686]
Security-constrained optimal power flow (SCOPF) is fundamental in power systems.
Modeling automatic primary response (APR) within the SCOPF problem results in complex large-scale mixed-integer programs.
This paper proposes a novel approach that combines deep learning and robust optimization techniques.
arXiv Detail & Related papers (2020-07-14T12:38:21Z)
- Symbolic Regression using Mixed-Integer Nonlinear Optimization [9.638685454900047]
Symbolic Regression (SR) is a hard problem in machine learning.
We propose a hybrid algorithm that combines mixed-integer nonlinear optimization with explicit enumeration.
We show that, on some synthetic data sets, our algorithm is competitive with state-of-the-art SR software and with a recent physics-inspired method called AI Feynman.
arXiv Detail & Related papers (2020-06-11T20:53:17Z)
- Sparse recovery by reduced variance stochastic approximation [5.672132510411465]
We discuss the application of iterative quadratic optimization routines to the problem of sparse signal recovery from noisy observations.
We show how one can straightforwardly enhance the reliability of the corresponding solution by using Median-of-Means-like techniques.
arXiv Detail & Related papers (2020-06-11T12:31:20Z)
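The Median-of-Means idea mentioned in the last entry is simple to state in code: split the samples into blocks, average within each block, and report the median of the block means, which is robust to a minority of outlier blocks. A minimal sketch follows, with an illustrative block count.

```python
import numpy as np

def median_of_means(samples, num_blocks=10):
    """Median of per-block means of a 1-D sample array."""
    rng = np.random.default_rng(0)            # fixed seed for reproducibility
    shuffled = rng.permutation(np.asarray(samples))
    blocks = np.array_split(shuffled, num_blocks)
    return np.median([block.mean() for block in blocks])
```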