Learning Convex Optimization Models
- URL: http://arxiv.org/abs/2006.04248v2
- Date: Thu, 18 Jun 2020 16:55:04 GMT
- Title: Learning Convex Optimization Models
- Authors: Akshay Agrawal, Shane Barratt, Stephen Boyd
- Abstract summary: A convex optimization model predicts an output from an input by solving a convex optimization problem.
We propose methods for learning the parameters in a convex optimization model given a dataset of input-output pairs.
- Score: 0.5524804393257919
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A convex optimization model predicts an output from an input by solving a
convex optimization problem. The class of convex optimization models is large,
and includes as special cases many well-known models like linear and logistic
regression. We propose a heuristic for learning the parameters in a convex
optimization model given a dataset of input-output pairs, using recently
developed methods for differentiating the solution of a convex optimization
problem with respect to its parameters. We describe three general classes of
convex optimization models, maximum a posteriori (MAP) models, utility
maximization models, and agent models, and present a numerical experiment for
each.
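As a concrete illustration of the learning procedure described above, here is a minimal sketch using cvxpylayers, a library (from an overlapping set of authors) for differentiating the solution of a convex optimization problem with respect to its parameters. The model shown is a hypothetical sparse MAP-style predictor, yhat(x) = argmin_y ||y - Theta x||^2 + lam * ||y||_1; Theta, lam, and the toy data are illustrative assumptions, not the paper's experiments.

```python
import cvxpy as cp
import torch
from cvxpylayers.torch import CvxpyLayer

n, m = 5, 3  # input / output dimensions (illustrative)

# Inner convex problem: the prediction solves
#   yhat = argmin_y ||y - p||^2 + lam * ||y||_1,
# where p = Theta @ x enters as a parameter so the problem stays DPP-compliant.
y = cp.Variable(m)
p = cp.Parameter(m)
lam = cp.Parameter(nonneg=True)
problem = cp.Problem(cp.Minimize(cp.sum_squares(y - p) + lam * cp.norm1(y)))
layer = CvxpyLayer(problem, parameters=[p, lam], variables=[y])

Theta = torch.randn(m, n, requires_grad=True)    # learnable model parameter
log_lam = torch.tensor(0.0, requires_grad=True)  # exp() keeps lam positive

X, Y = torch.randn(50, n), torch.randn(50, m)    # toy input-output pairs

opt = torch.optim.Adam([Theta, log_lam], lr=0.1)
for _ in range(100):
    opt.zero_grad()
    Yhat, = layer(X @ Theta.T, log_lam.exp().expand(X.shape[0]))  # batched solve
    loss = ((Yhat - Y) ** 2).mean()  # fitting loss on the input-output pairs
    loss.backward()                  # gradients flow through the solver
    opt.step()
```

Passing the linear prediction Theta x into the layer as a parameter, rather than Theta itself, keeps the inner problem within the disciplined parametrized programming rules that cvxpylayers requires.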
Related papers
- Efficient Optimization Algorithms for Linear Adversarial Training [9.933836677441684]
Adversarial training can be used to learn models that are robust against perturbations.
We propose tailored optimization algorithms for the adversarial training of linear models (a closed-form sketch of the linear case follows below).
arXiv Detail & Related papers (2024-10-16T15:41:08Z)
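For intuition about why linear models deserve tailored algorithms, note a standard closed form (not necessarily the method of the paper above): under an l-infinity perturbation of radius eps, the worst-case margin of a linear classifier is y * (w^T x) - eps * ||w||_1, so adversarial training needs no inner attack loop. A minimal PyTorch sketch with assumed toy data:

```python
import torch

# Worst-case l_inf perturbation of radius eps shrinks the margin of a linear
# classifier by exactly eps * ||w||_1 (generic fact; the paper's tailored
# algorithms are not reproduced here).
def adv_logistic_loss(w, X, y, eps):
    margins = y * (X @ w) - eps * w.abs().sum()           # worst-case margins
    return torch.nn.functional.softplus(-margins).mean()  # logistic loss

w = torch.zeros(10, requires_grad=True)
X, y = torch.randn(100, 10), torch.sign(torch.randn(100))  # toy labeled data
opt = torch.optim.SGD([w], lr=0.1)
for _ in range(200):
    opt.zero_grad()
    adv_logistic_loss(w, X, y, eps=0.05).backward()
    opt.step()
```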
- Differentially Private Optimization with Sparse Gradients [60.853074897282625]
We study differentially private (DP) optimization problems under sparsity of individual gradients.
Building on this, we obtain pure- and approximate-DP algorithms with almost optimal rates for convex optimization with sparse gradients (the generic DP-SGD template is sketched below).
arXiv Detail & Related papers (2024-04-16T20:01:10Z)
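For background, the generic DP-SGD template that such rates are usually measured against clips each per-example gradient and adds Gaussian noise. The sketch below shows only this standard mechanism, not the paper's sparsity-aware algorithms; all names and constants are illustrative.

```python
import torch

def dp_sgd_step(w, per_example_grads, clip=1.0, noise_mult=1.0, lr=0.1):
    """One standard DP-SGD step: clip each example's gradient, add noise."""
    norms = per_example_grads.norm(dim=1, keepdim=True)
    clipped = per_example_grads * (clip / norms).clamp(max=1.0)  # norm <= clip
    noisy_sum = clipped.sum(0) + noise_mult * clip * torch.randn_like(w)
    return w - lr * noisy_sum / per_example_grads.shape[0]

w = torch.zeros(20)
grads = torch.randn(32, 20)  # one (toy) gradient per example in the batch
w = dp_sgd_step(w, grads)
```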
- Functional Graphical Models: Structure Enables Offline Data-Driven Optimization [111.28605744661638]
We show how structure can enable sample-efficient data-driven optimization.
We also present a data-driven optimization algorithm that infers the FGM structure itself.
arXiv Detail & Related papers (2024-01-08T22:33:14Z)
- The Convex Landscape of Neural Networks: Characterizing Global Optima and Stationary Points via Lasso Models [75.33431791218302]
Deep neural network (DNN) models are widely used, but their training objectives are non-convex.
In this paper we examine convex (Lasso-type) reformulations of neural network training.
We show that the stationary points of the non-convex objective can be characterized as global optima of subsampled convex programs.
arXiv Detail & Related papers (2023-12-19T23:04:56Z)
- Generative Models for Anomaly Detection and Design-Space Dimensionality Reduction in Shape Optimization [0.0]
Our work presents a novel approach to shape optimization, with the twofold objective to improve the efficiency of global algorithms and to promote the generation of high-quality designs.
This is accomplished by reducing the number of original design variables, defining a reduced subspace in which the geometric variance is maximized.
Numerical results show that the new framework improves the convergence of global optimization algorithms while generating only designs with high-quality geometric features (a PCA-style reduction is sketched below).
arXiv Detail & Related papers (2023-08-08T04:57:58Z)
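The variance-maximizing reduction described above can be illustrated, in spirit, by a PCA-style projection of sampled design vectors (a generic sketch, not the paper's generative models; shapes and names are assumed):

```python
import torch

designs = torch.randn(200, 30)        # 200 sampled designs, 30 design variables
centered = designs - designs.mean(0)  # center before extracting directions
U, S, Vt = torch.linalg.svd(centered, full_matrices=False)
k = 5                                 # reduced dimensionality (assumed)
basis = Vt[:k]                        # top-k directions of maximal variance
reduced = centered @ basis.T          # coordinates in the reduced subspace
restored = reduced @ basis + designs.mean(0)  # map back to the full space
```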
- FineMorphs: Affine-diffeomorphic sequences for regression [1.1421942894219896]
The model states are optimally "reshaped" by diffeomorphisms generated by smooth vector fields during learning.
Affine transformations and vector fields are optimized within an optimal control setting.
The model can naturally reduce (or increase) dimensionality and adapt to large datasets via suboptimal vector fields.
arXiv Detail & Related papers (2023-05-26T20:54:18Z)
- Backpropagation of Unrolled Solvers with Folded Optimization [55.04219793298687]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
One typical strategy is algorithm unrolling, which relies on automatic differentiation through the operations of an iterative solver.
This paper provides theoretical insights into the backward pass of unrolled optimization, leading to a system for generating efficiently solvable analytical models of backpropagation (a plain unrolling sketch follows below).
arXiv Detail & Related papers (2023-01-28T01:50:42Z)
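Algorithm unrolling itself is easy to demonstrate: run a fixed number of solver iterations inside the autodiff graph and backpropagate through them. The sketch below unrolls projected gradient descent on a toy nonnegative least-squares problem; it shows plain unrolling only, not the folded-optimization system of the paper above.

```python
import torch

# Unrolled projected gradient descent for min_z ||A z - b||^2 s.t. z >= 0.
# Every iterate stays in the autodiff graph, so d(z_star)/d(b) comes from
# ordinary backpropagation through the loop.
def unrolled_solver(A, b, steps=100, lr=0.01):
    z = torch.zeros(A.shape[1])
    for _ in range(steps):
        grad = 2 * A.T @ (A @ z - b)             # inner-objective gradient
        z = torch.clamp(z - lr * grad, min=0.0)  # projected gradient step
    return z

A = torch.randn(8, 4)
b = torch.randn(8, requires_grad=True)
z_star = unrolled_solver(A, b)
z_star.sum().backward()  # backprop through all unrolled iterations
print(b.grad)            # sensitivity of the solution w.r.t. b
```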
- Learning to Refit for Convex Learning Problems [11.464758257681197]
We propose a framework that uses neural networks to learn to estimate the optimized model parameters for different training sets.
We rigorously characterize the power of neural networks to approximate convex problems.
arXiv Detail & Related papers (2021-11-24T15:28:50Z)
- Non-parametric Models for Non-negative Functions [48.7576911714538]
We provide the first model for non-negative functions that enjoys the same good properties as linear models.
We prove that it admits a representer theorem and provide an efficient dual formulation for convex problems (a PSD-parametrized sketch follows below).
arXiv Detail & Related papers (2020-07-08T07:17:28Z)
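One way to make the entry above concrete (an assumed parametrization in the spirit of the summary, not a quote of the paper): take f(x) = phi(x)^T A phi(x) with A positive semidefinite, which is non-negative everywhere yet linear in the parameter A, just like a linear model.

```python
import torch

def phi(x):  # hypothetical polynomial feature map
    return torch.stack([torch.ones_like(x), x, x ** 2], dim=-1)

B = torch.randn(3, 3, requires_grad=True)  # learnable square root of A

def f(x):
    A = B @ B.T                         # A = B B^T is PSD by construction
    feats = phi(x)                      # shape (..., 3)
    return (feats @ A * feats).sum(-1)  # phi(x)^T A phi(x) >= 0 for all x

x = torch.linspace(-1.0, 1.0, 5)
print(f(x))  # every value is non-negative
```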
- Automatically Learning Compact Quality-aware Surrogates for Optimization Problems [55.94450542785096]
Solving optimization problems with unknown parameters often requires learning a model to predict those parameters and then solving the problem with the predicted values.
Recent work has shown that including the optimization problem as a layer in the training pipeline yields predictions that lead to higher-quality decisions.
We show that we can improve solution quality by learning a low-dimensional surrogate model of a large optimization problem.
arXiv Detail & Related papers (2020-06-18T19:11:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.