Spectrahedral Regression
- URL: http://arxiv.org/abs/2110.14779v1
- Date: Wed, 27 Oct 2021 21:21:19 GMT
- Title: Spectrahedral Regression
- Authors: Eliza O'Reilly and Venkat Chandrasekaran
- Abstract summary: We present a new approach to the problem of fitting a convex function to a data set consisting of input-output pairs.
We prove bounds on how well spectrahedral functions can approximate arbitrary convex functions via statistical risk analysis.
We demonstrate our approach with experiments on synthetic data sets as well as real data arising in applications such as economics and engineering design.
- Score: 7.2361978133966165
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convex regression is the problem of fitting a convex function to a data set
consisting of input-output pairs. We present a new approach to this problem
called spectrahedral regression, in which we fit a spectrahedral function to
the data, i.e. a function that is the maximum eigenvalue of an affine matrix
expression of the input. This method represents a significant generalization of
polyhedral (also called max-affine) regression, in which a polyhedral function
(a maximum of a fixed number of affine functions) is fit to the data. We prove
bounds on how well spectrahedral functions can approximate arbitrary convex
functions via statistical risk analysis. We also analyze an alternating
minimization algorithm for the non-convex optimization problem of fitting the
best spectrahedral function to a given data set. We show that this algorithm
converges geometrically with high probability to a small ball around the
optimal parameter given a good initialization. Finally, we demonstrate the
utility of our approach with experiments on synthetic data sets as well as real
data arising in applications such as economics and engineering design.
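The spectrahedral function described in the abstract, the maximum eigenvalue of an affine matrix expression of the input, can be sketched in a few lines. The following is a minimal illustration only, not the authors' implementation; the function names and matrix shapes are assumptions. It also shows how polyhedral (max-affine) regression arises as the special case of diagonal matrices.

```python
import numpy as np

def spectrahedral_function(x, A0, A):
    """Evaluate f(x) = lambda_max(A0 + sum_i x_i * A_i),
    the maximum eigenvalue of an affine matrix pencil.

    x  : (d,) input vector
    A0 : (m, m) symmetric matrix (constant term)
    A  : list of d symmetric (m, m) matrices (one per input coordinate)
    """
    M = A0 + sum(xi * Ai for xi, Ai in zip(x, A))
    # eigvalsh returns the eigenvalues of a symmetric matrix in ascending order
    return np.linalg.eigvalsh(M)[-1]

# Polyhedral (max-affine) regression is the diagonal special case:
# f(x) = max_j (b_j + <a_j, x>) corresponds to A0 = diag(b), A_i = diag(a[:, i]).
b = np.array([0.0, 1.0])      # offsets of two affine pieces
a = np.array([[2.0], [-1.0]]) # slopes of two affine pieces, d = 1
A0 = np.diag(b)
A = [np.diag(a[:, 0])]
x = np.array([3.0])
# max(0 + 2*3, 1 - 1*3) = max(6, -2) = 6
print(spectrahedral_function(x, A0, A))  # -> 6.0
```

With non-diagonal matrices the maximum eigenvalue is a smooth blend of the affine pieces rather than a hard maximum, which is what gives spectrahedral functions their extra expressive power over the polyhedral case.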
Related papers
- Highly Adaptive Ridge [84.38107748875144]
We propose a regression method that achieves a dimension-free $n^{-2/3}$ $L_2$ convergence rate over the class of right-continuous functions with square-integrable sectional derivatives.
HAR is exactly kernel ridge regression with a specific data-adaptive kernel based on a saturated zero-order tensor-product spline basis expansion.
We demonstrate empirical performance better than state-of-the-art algorithms, for small datasets in particular.
arXiv Detail & Related papers (2024-10-03T17:06:06Z) - A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we present a robust phase retrieval problem where the task is to recover an unknown signal.
Our proposed approach avoids the need for computationally expensive spectral initialization, using a simple gradient step while remaining robust to outliers.
arXiv Detail & Related papers (2024-09-07T06:37:23Z) - Kernel-based off-policy estimation without overlap: Instance optimality
beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z) - Max-Linear Regression by Convex Programming [5.366354612549172]
We formulate and analyze a scalable convex program given by anchored regression (AR) as the estimator for the max-linear regression problem.
Our result shows that the number of noise-free observations sufficient for exact recovery scales as $k^4 p$ up to a logarithmic factor.
arXiv Detail & Related papers (2021-03-12T00:55:54Z) - Ridge regression with adaptive additive rectangles and other piecewise
functional templates [0.0]
We propose an $L_2$-based penalization algorithm for functional linear regression models.
We show how our algorithm alternates between approximating a suitable template and solving a convex ridge-like problem.
arXiv Detail & Related papers (2020-11-02T15:28:54Z) - Piecewise Linear Regression via a Difference of Convex Functions [50.89452535187813]
We present a new piecewise linear regression methodology that fits a difference of convex functions (DC functions) to the data.
We empirically validate the method, showing it to be practically implementable, and to have comparable performance to existing regression/classification methods on real-world datasets.
arXiv Detail & Related papers (2020-07-05T18:58:47Z) - Tight Nonparametric Convergence Rates for Stochastic Gradient Descent
under the Noiseless Linear Model [0.0]
We analyze the convergence of single-pass, fixed step-size gradient descent on the least-square risk under this model.
As a special case, we analyze an online algorithm for estimating a real function on the unit interval from the noiseless observation of its value at randomly sampled points.
arXiv Detail & Related papers (2020-06-15T08:25:50Z) - SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for
Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z) - Improved guarantees and a multiple-descent curve for Column Subset
Selection and the Nyström method [76.73096213472897]
We develop techniques which exploit spectral properties of the data matrix to obtain improved approximation guarantees.
Our approach leads to significantly better bounds for datasets with known rates of singular value decay.
We show that both our improved bounds and the multiple-descent curve can be observed on real datasets simply by varying the RBF parameter.
arXiv Detail & Related papers (2020-02-21T00:43:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.