Nonsmooth Nonparametric Regression via Fractional Laplacian Eigenmaps
- URL: http://arxiv.org/abs/2402.14985v1
- Date: Thu, 22 Feb 2024 21:47:29 GMT
- Title: Nonsmooth Nonparametric Regression via Fractional Laplacian Eigenmaps
- Authors: Zhaoyang Shi, Krishnakumar Balasubramanian and Wolfgang Polonik
- Abstract summary: We develop nonparametric regression methods for the case when the true regression function is not necessarily smooth.
More specifically, our approach uses the fractional Laplacian and is designed to handle the case when the true regression function lies in an $L_2$-fractional Sobolev space with order $s\in (0,1)$.
- Score: 15.738019181349992
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop nonparametric regression methods for the case when the true
regression function is not necessarily smooth. More specifically, our approach
uses the fractional Laplacian and is designed to handle the case when the
true regression function lies in an $L_2$-fractional Sobolev space with order
$s\in (0,1)$. This function class is a Hilbert space lying between the space of
square-integrable functions and the first-order Sobolev space consisting of
differentiable functions. It contains fractional power functions, piecewise
constant or polynomial functions, and bump functions as canonical examples. For
the proposed approach, we prove upper bounds on the in-sample mean-squared
estimation error of order $n^{-\frac{2s}{2s+d}}$, where $d$ is the dimension,
$s$ is the aforementioned order parameter and $n$ is the number of
observations. We also provide preliminary empirical results validating the
practical performance of the developed estimators.
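The estimator described above can be illustrated with a minimal numerical sketch: build a kernel graph Laplacian from the design points, take a fractional power $s$ of its eigenvalues, and fit the responses by a ridge-type projection onto the smoothest eigenvectors. The kernel bandwidth `h`, the number of eigenvectors `K`, and the ridge weight `lam` are illustrative choices, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, K, s, lam = 200, 20, 0.5, 1e-2  # samples, eigenvectors kept, fractional order, ridge weight

# 1-D design with a nonsmooth (piecewise-constant) target, as in the paper's function class
X = rng.uniform(0.0, 1.0, size=(n, 1))
y = np.sign(X[:, 0] - 0.5) + 0.1 * rng.standard_normal(n)

# Gaussian-kernel graph Laplacian L = D - W (bandwidth h is an illustrative choice)
h = 0.1
W = np.exp(-((X - X.T) ** 2) / (2.0 * h**2))
L = np.diag(W.sum(axis=1)) - W

# Spectral decomposition; the fractional power L^s acts on the eigenvalues
eigvals, eigvecs = np.linalg.eigh(L)
frac_vals = np.clip(eigvals, 0.0, None) ** s

# Ridge-type fit in the span of the K smoothest eigenvectors,
# penalized through the fractional eigenvalues
V = eigvecs[:, :K]
coef = (V.T @ y) / (1.0 + lam * frac_vals[:K])
y_hat = V @ coef

in_sample_mse = np.mean((y_hat - y) ** 2)
```

Even though the target has a jump (and so lies in no first-order Sobolev ball), the low-frequency eigenvector fit tracks it, which is the regime the fractional order $s\in(0,1)$ is meant to capture.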
Related papers
- Data subsampling for Poisson regression with pth-root-link [53.63838219437508]
We develop and analyze data subsampling techniques for Poisson regression.
In particular, we consider the Poisson generalized linear model with identity- and square-root link functions.
arXiv Detail & Related papers (2024-10-30T10:09:05Z) - Learning a Sparse Representation of Barron Functions with the Inverse
Scale Space Flow [3.249853429482705]
Given an $L^2$ function $f$, the inverse scale space flow is used to find a sparse measure $\mu$.
The convergence properties of this method are analysed in an ideal setting and in the cases of measurement noise and sampling bias.
arXiv Detail & Related papers (2023-12-05T11:26:02Z) - Kernel-based off-policy estimation without overlap: Instance optimality
beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z) - A first-order primal-dual method with adaptivity to local smoothness [64.62056765216386]
We consider the problem of finding a saddle point for the convex-concave objective $\min_x \max_y f(x) + \langle Ax, y\rangle - g^*(y)$, where $f$ is a convex function with locally Lipschitz gradient and $g$ is convex and possibly non-smooth.
We propose an adaptive version of the Condat-Vu algorithm, which alternates between primal gradient steps and dual steps.
arXiv Detail & Related papers (2021-10-28T14:19:30Z) - Localization in 1D non-parametric latent space models from pairwise
affinities [6.982738885923206]
We consider the problem of estimating latent positions in a one-dimensional torus from pairwise affinities.
We introduce an estimation procedure that provably localizes all the latent positions with a maximum error of the order of $\sqrt{\log(n)/n}$, with high probability.
arXiv Detail & Related papers (2021-08-06T13:05:30Z) - Finding Global Minima via Kernel Approximations [90.42048080064849]
We consider the global minimization of smooth functions based solely on function evaluations.
In this paper, we consider an approach that jointly models the function to approximate and finds a global minimum.
arXiv Detail & Related papers (2020-12-22T12:59:30Z) - Truncated Linear Regression in High Dimensions [26.41623833920794]
In truncated linear regression, we observe samples $(A_i, y_i)_i$ whose dependent variable equals $y_i = A_i^{\mathrm{T}} \cdot x^* + \eta_i$, where $x^*$ is some fixed unknown vector of interest.
The goal is to recover $x^*$ under some favorable conditions on the $A_i$'s and the noise distribution.
We prove that there exists a computationally and statistically efficient method for recovering $k$-sparse $n$-dimensional vectors $x^*$ from $m$ truncated samples.
arXiv Detail & Related papers (2020-07-29T00:31:34Z) - Piecewise Linear Regression via a Difference of Convex Functions [50.89452535187813]
We present a new piecewise linear regression methodology that utilizes fitting a difference of convex functions (DC functions) to the data.
We empirically validate the method, showing it to be practically implementable, and to have comparable performance to existing regression/classification methods on real-world datasets.
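The key structural fact this method exploits is that any piecewise linear function can be written as a difference of two convex max-affine functions. A minimal check of that representation (the fitting algorithm itself is not reproduced here):

```python
import numpy as np

def max_affine(x, a, b):
    # pointwise maximum of affine pieces: max_i (a[i] * x + b[i])
    return np.max(np.outer(x, a) + np.asarray(b), axis=1)

x = np.linspace(-2.0, 2.0, 101)

# The hat function 1 - |x| is nonconvex, but it is a difference of two
# convex max-affine functions: g(x) = 1 and h(x) = max(x, -x) = |x|
g = max_affine(x, [0.0], [1.0])
h = max_affine(x, [1.0, -1.0], [0.0, 0.0])
f = g - h  # equals 1 - |x| on the grid
```

Fitting then amounts to choosing the slopes and intercepts of the two max-affine components from data.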
arXiv Detail & Related papers (2020-07-05T18:58:47Z) - Tight Nonparametric Convergence Rates for Stochastic Gradient Descent
under the Noiseless Linear Model [0.0]
We analyze the convergence of single-pass, fixed step-size gradient descent on the least-square risk under this model.
As a special case, we analyze an online algorithm for estimating a real function on the unit interval from the noiseless observation of its value at randomly sampled points.
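The noiseless setting described above is easy to reproduce: with exact observations $y_i = \langle x_i, w^*\rangle$, single-pass SGD with a fixed step size converges to the true parameter rather than to a noise floor. A minimal sketch (dimension, step size, and sample count are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, step = 5, 5000, 0.1
w_star = rng.standard_normal(d)  # ground-truth parameter

w = np.zeros(d)
for _ in range(n):
    x = rng.standard_normal(d)
    y = x @ w_star                   # noiseless observation
    w -= step * (x @ w - y) * x      # single-pass, fixed-step-size SGD

err = np.linalg.norm(w - w_star)     # shrinks geometrically in the noiseless model
```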
arXiv Detail & Related papers (2020-06-15T08:25:50Z) - A Random Matrix Analysis of Random Fourier Features: Beyond the Gaussian
Kernel, a Precise Phase Transition, and the Corresponding Double Descent [85.77233010209368]
This article characterizes the exact asymptotics of random Fourier feature (RFF) regression, in the realistic setting where the number of data samples $n$, their dimension $p$, and the number of random features $N$ are all large and comparable.
This analysis also provides accurate estimates of training and test regression errors for large $n,p,N$.
arXiv Detail & Related papers (2020-06-09T02:05:40Z)
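For context on the RFF entry above, here is a minimal sketch of random Fourier feature ridge regression in the Rahimi–Recht style, with $n$ samples, input dimension $p$ (written `d` below), and $N$ random features; the bandwidth, ridge weight, and target function are illustrative choices, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, N = 300, 2, 100  # samples, input dimension, random features

X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(n)

# Random Fourier features approximating a Gaussian kernel of bandwidth sigma
sigma = 1.0
Omega = rng.standard_normal((d, N)) / sigma
b = rng.uniform(0.0, 2.0 * np.pi, N)
Phi = np.sqrt(2.0 / N) * np.cos(X @ Omega + b)

# Ridge regression in the random feature space
lam = 1e-3
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(N), Phi.T @ y)
train_mse = np.mean((Phi @ beta - y) ** 2)
```

The regime the paper analyzes is precisely when $n$, $d$, and $N$ above are all large and of comparable size.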
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.