Efficient algorithms for multivariate shape-constrained convex
regression problems
- URL: http://arxiv.org/abs/2002.11410v1
- Date: Wed, 26 Feb 2020 11:18:43 GMT
- Title: Efficient algorithms for multivariate shape-constrained convex
regression problems
- Authors: Meixia Lin, Defeng Sun, Kim-Chuan Toh
- Abstract summary: We prove that the least squares estimator is computable via solving a constrained convex quadratic programming (QP) problem with $(n+1)d$ variables and at least $n(n-1)$ linear inequality constraints.
For solving the generally very large-scale convex QP, we design two efficient algorithms, one is the symmetric Gauss-Seidel based alternating direction method of multipliers ({\tt sGS-ADMM}), and the other is the proximal augmented Lagrangian method ({\tt pALM}) with the subproblems solved by the semismooth Newton method ({\tt SSN}).
- Score: 9.281671380673306
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Shape-constrained convex regression problem deals with fitting a convex
function to the observed data, where additional constraints are imposed, such
as component-wise monotonicity and uniform Lipschitz continuity. This paper
provides a comprehensive mechanism for computing the least squares estimator of
a multivariate shape-constrained convex regression function in $\mathbb{R}^d$.
We prove that the least squares estimator is computable via solving a
constrained convex quadratic programming (QP) problem with $(n+1)d$ variables
and at least $n(n-1)$ linear inequality constraints, where $n$ is the number of
data points. For solving the generally very large-scale convex QP, we design
two efficient algorithms, one is the symmetric Gauss-Seidel based alternating
direction method of multipliers ({\tt sGS-ADMM}), and the other is the proximal
augmented Lagrangian method ({\tt pALM}) with the subproblems solved by the
semismooth Newton method ({\tt SSN}). Comprehensive numerical experiments,
including those in the pricing of basket options and estimation of production
functions in economics, demonstrate that both of our proposed algorithms
outperform the state-of-the-art algorithm. The {\tt pALM} is more efficient
than the {\tt sGS-ADMM} but the latter has the advantage of being simpler to
implement.
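The abstract characterizes the least squares estimator as a piecewise-linear function determined by fitted values $\theta_i$ and subgradients $\xi_i$ at each data point, with convexity enforced by the $n(n-1)$ linear inequalities $\theta_j \geq \theta_i + \langle \xi_i, x_j - x_i \rangle$ for all $i \neq j$. A minimal pure-Python sketch (not the paper's code; all names are illustrative) that enumerates these constraints and checks whether a candidate fit is feasible:

```python
# Sketch of the convexity constraints in shape-constrained convex regression:
# the estimator is piecewise linear, given by fitted values theta_i and
# subgradients xi_i at each data point x_i, and must satisfy
#   theta_j >= theta_i + <xi_i, x_j - x_i>   for all i != j,
# i.e. n(n-1) linear inequalities on the (theta, xi) variables.

def convexity_constraints(X):
    """Yield the (i, j) index pairs of the n(n-1) convexity constraints."""
    n = len(X)
    for i in range(n):
        for j in range(n):
            if i != j:
                yield i, j

def is_feasible(X, theta, xi, tol=1e-9):
    """Check theta_j >= theta_i + <xi_i, x_j - x_i> for every pair i != j."""
    for i, j in convexity_constraints(X):
        inner = sum(g * (a - b) for g, a, b in zip(xi[i], X[j], X[i]))
        if theta[j] < theta[i] + inner - tol:
            return False
    return True

# Example: f(x) = x1^2 + x2^2 sampled at three points. Using the exact
# function values and gradients yields a feasible (convex) fit.
X = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
theta = [x1**2 + x2**2 for x1, x2 in X]   # fitted values
xi = [(2 * x1, 2 * x2) for x1, x2 in X]   # subgradients (here: gradients)
assert sum(1 for _ in convexity_constraints(X)) == len(X) * (len(X) - 1)
assert is_feasible(X, theta, xi)
```

In the paper's QP these inequalities form the feasible set (with extra constraints for monotonicity or Lipschitz bounds); {\tt sGS-ADMM} and {\tt pALM} are designed precisely to cope with this quadratic growth in the number of constraints as $n$ increases.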
Related papers
- Projection by Convolution: Optimal Sample Complexity for Reinforcement Learning in Continuous-Space MDPs [56.237917407785545]
We consider the problem of learning an $\varepsilon$-optimal policy in a general class of continuous-space Markov decision processes (MDPs) having smooth Bellman operators.
Key to our solution is a novel projection technique based on ideas from harmonic analysis.
Our result bridges the gap between two popular but conflicting perspectives on continuous-space MDPs.
arXiv Detail & Related papers (2024-05-10T09:58:47Z) - Stochastic Optimization for Non-convex Problem with Inexact Hessian
Matrix, Gradient, and Function [99.31457740916815]
Trust-region (TR) and adaptive regularization using cubics (ARC) have proven to have some very appealing theoretical properties.
We show that TR and ARC methods can simultaneously provide inexact computations of the Hessian, gradient, and function values.
arXiv Detail & Related papers (2023-10-18T10:29:58Z) - Orthogonal Directions Constrained Gradient Method: from non-linear
equality constraints to Stiefel manifold [16.099883128428054]
We propose a novel algorithm, the Orthogonal Directions Constrained Method (ODCGM)
ODCGM only requires computing a projection onto a vector space.
We show that ODCGM exhibits near-optimal oracle complexities.
arXiv Detail & Related papers (2023-03-16T12:25:53Z) - Stochastic Inexact Augmented Lagrangian Method for Nonconvex Expectation
Constrained Optimization [88.0031283949404]
Many real-world problems have complicated nonconvex functional constraints and use a large number of data points.
Our proposed method outperforms an existing method with the previously best-known result.
arXiv Detail & Related papers (2022-12-19T14:48:54Z) - Multi-block-Single-probe Variance Reduced Estimator for Coupled
Compositional Optimization [49.58290066287418]
We propose a novel method named Multi-block-Single-probe Variance Reduced (MSVR) estimator to alleviate the complexity of compositional problems.
Our results improve upon prior ones in several aspects, including the order of sample complexities and the dependence on the strong convexity parameter.
arXiv Detail & Related papers (2022-07-18T12:03:26Z) - Statistical Inference of Constrained Stochastic Optimization via Sketched Sequential Quadratic Programming [53.63469275932989]
We consider online statistical inference of constrained nonlinear optimization problems.
We apply a Stochastic Sequential Quadratic Programming (StoSQP) method to solve these problems.
arXiv Detail & Related papers (2022-05-27T00:34:03Z) - A stochastic linearized proximal method of multipliers for convex
stochastic optimization with expectation constraints [8.133190610747974]
We present a computable approximation type algorithm, namely the stochastic linearized proximal method of multipliers.
Some preliminary numerical results demonstrate the performance of the proposed algorithm.
arXiv Detail & Related papers (2021-06-22T07:24:17Z) - Hybrid Trilinear and Bilinear Programming for Aligning Partially
Overlapping Point Sets [85.71360365315128]
In many applications, we need algorithms that can align partially overlapping point sets and are invariant to the corresponding transformations, as in the robust point matching (RPM) algorithm.
We first show that the objective is a cubic polynomial function. We then utilize the convex envelopes of trilinear and bilinear monomial transformations to derive its lower bound.
We next develop a branch-and-bound (BnB) algorithm which only branches over the transformation variables and runs efficiently.
arXiv Detail & Related papers (2021-01-19T04:24:23Z) - Ridge regression with adaptive additive rectangles and other piecewise
functional templates [0.0]
We propose an $L_2$-based penalization algorithm for functional linear regression models.
We show how our algorithm alternates between approximating a suitable template and solving a convex ridge-like problem.
arXiv Detail & Related papers (2020-11-02T15:28:54Z) - Hybrid Variance-Reduced SGD Algorithms For Nonconvex-Concave Minimax
Problems [26.24895953952318]
We develop an algorithm to solve a class of nonconvex-concave minimax problems.
It can work with either single-sample or mini-batch estimates of the derivatives.
arXiv Detail & Related papers (2020-06-27T03:05:18Z) - Subgradient Regularized Multivariate Convex Regression at Scale [21.55988846648753]
We present new large-scale algorithms for fitting a subgradient regularized convex regression function to $n$ samples in $d$ dimensions.
We show that our framework can approximately solve instances of the subgradient regularized convex regression problem with $n=10^5$ and $d=10$ within minutes.
arXiv Detail & Related papers (2020-05-23T19:08:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.