Nonparametric, Nonasymptotic Confidence Bands with Paley-Wiener Kernels
for Band-Limited Functions
- URL: http://arxiv.org/abs/2206.13629v1
- Date: Mon, 27 Jun 2022 21:03:51 GMT
- Title: Nonparametric, Nonasymptotic Confidence Bands with Paley-Wiener Kernels
for Band-Limited Functions
- Authors: Balázs Csanád Csáji, Bálint Horváth
- Abstract summary: The paper introduces a method to construct confidence bands for bounded, band-limited functions based on a finite sample of input-output pairs.
The approach is distribution-free w.r.t. the observation noises and only the knowledge of the input distribution is assumed.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The paper introduces a method to construct confidence bands for bounded,
band-limited functions based on a finite sample of input-output pairs. The
approach is distribution-free w.r.t. the observation noises and only the
knowledge of the input distribution is assumed. It is nonparametric, that is,
it does not require a parametric model of the regression function, and the
regions have non-asymptotic guarantees. The algorithm is based on the theory of
Paley-Wiener reproducing kernel Hilbert spaces. The paper first studies the
fully observable variant, when there are no noises on the observations and only
the inputs are random; then it generalizes the ideas to the noisy case using
gradient-perturbation methods. Finally, numerical experiments demonstrating
both cases are presented.
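The abstract only names the main ingredients. As a rough illustration, the minimal Python sketch below shows the Paley-Wiener (sinc-type) reproducing kernel and a minimum-norm kernel interpolant of noise-free observations, i.e., a point estimate of the kind a confidence band would be centered on in the fully observable case. The band limit `eta`, the example target function, and the jitter term are illustrative assumptions; the sketch does not reproduce the paper's actual confidence-band or gradient-perturbation constructions.

```python
import numpy as np

def paley_wiener_kernel(x, y, eta=np.pi):
    """Paley-Wiener reproducing kernel of functions band-limited to [-eta, eta]:
    K(u, v) = sin(eta * (u - v)) / (pi * (u - v))."""
    d = np.subtract.outer(np.atleast_1d(x), np.atleast_1d(y))
    d_safe = np.where(np.abs(d) < 1e-12, 1.0, d)          # avoid 0/0 on the diagonal
    return np.where(np.abs(d) < 1e-12, eta / np.pi,
                    np.sin(eta * d_safe) / (np.pi * d_safe))

def minimum_norm_interpolant(x_obs, y_obs, x_query, eta=np.pi, jitter=1e-8):
    """Minimum-norm RKHS interpolant of noise-free observations; `jitter` is an
    illustrative regularization term added only for numerical stability."""
    K = paley_wiener_kernel(x_obs, x_obs, eta)
    coeffs = np.linalg.solve(K + jitter * np.eye(len(x_obs)), y_obs)
    return paley_wiener_kernel(x_query, x_obs, eta) @ coeffs

# Tiny demo: a band-limited target observed without noise at random inputs.
rng = np.random.default_rng(0)
x_obs = rng.uniform(-5.0, 5.0, size=30)                   # inputs from a known distribution
target = lambda x: np.sinc(x / np.pi) + 0.5 * np.cos(x)   # sin(x)/x + 0.5*cos(x), band limit 1 < pi
y_obs = target(x_obs)
x_grid = np.linspace(-5.0, 5.0, 201)
f_hat = minimum_norm_interpolant(x_obs, y_obs, x_grid)    # point estimate on the grid
```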
Related papers
- Epistemic Uncertainty and Observation Noise with the Neural Tangent Kernel [12.464924018243988]
Recent work has shown that training wide neural networks with gradient descent is formally equivalent to computing the mean of the posterior distribution in a Gaussian Process.
We show how to deal with non-zero aleatoric noise and derive an estimator for the posterior covariance.
arXiv Detail & Related papers (2024-09-06T00:34:44Z) - A Unified Theory of Stochastic Proximal Point Methods without Smoothness [52.30944052987393]
Proximal point methods have attracted considerable interest owing to their numerical stability and robustness against imperfect tuning.
This paper presents a comprehensive analysis of a broad range of variants of the stochastic proximal point method (SPPM).
arXiv Detail & Related papers (2024-05-24T21:09:19Z) - Improving Kernel-Based Nonasymptotic Simultaneous Confidence Bands [0.0]
The paper studies the problem of constructing nonparametric simultaneous confidence bands with nonasymptotic and distribution-free guarantees.
The approach is based on the theory of Paley-Wiener reproducing kernel Hilbert spaces.
arXiv Detail & Related papers (2024-01-28T22:43:33Z) - Kernel-based off-policy estimation without overlap: Instance optimality
beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z) - Nonparametric Conditional Local Independence Testing [69.31200003384122]
Conditional local independence is an independence relation among continuous time processes.
No nonparametric test of conditional local independence has been available.
We propose such a nonparametric test based on double machine learning.
arXiv Detail & Related papers (2022-03-25T10:31:02Z) - Optimal policy evaluation using kernel-based temporal difference methods [78.83926562536791]
We use reproducing kernel Hilbert spaces for estimating the value function of an infinite-horizon discounted Markov reward process (MRP).
We derive a non-asymptotic upper bound on the error with explicit dependence on the eigenvalues of the associated kernel operator.
We prove minimax lower bounds over sub-classes of MRPs.
arXiv Detail & Related papers (2021-09-24T14:48:20Z) - Robust Uncertainty Bounds in Reproducing Kernel Hilbert Spaces: A Convex
Optimization Approach [9.462535418331615]
It is known that out-of-sample bounds can be established at unseen input locations.
We show how computing tight, finite-sample uncertainty bounds amounts to solving parametrically constrained linear programs.
arXiv Detail & Related papers (2021-04-19T19:27:52Z) - Towards Unbiased Random Features with Lower Variance For Stationary
Indefinite Kernels [26.57122949130266]
Our algorithm achieves lower variance and approximation error compared with the existing kernel approximation methods.
With better approximation to the originally selected kernels, improved classification accuracy and regression ability are obtained.
arXiv Detail & Related papers (2021-04-13T13:56:50Z) - Adversarial Estimation of Riesz Representers [21.510036777607397]
We propose an adversarial framework to estimate the Riesz representer using general function spaces.
We prove a nonasymptotic mean square rate in terms of an abstract quantity called the critical radius, then specialize it for neural networks, random forests, and reproducing kernel Hilbert spaces as leading cases.
arXiv Detail & Related papers (2020-12-30T19:46:57Z) - Uncertainty quantification for nonconvex tensor completion: Confidence
intervals, heteroscedasticity and optimality [92.35257908210316]
We study the problem of estimating a low-rank tensor given incomplete and corrupted observations.
We find that it attains unimprovable rates of $\ell_2$ accuracy.
arXiv Detail & Related papers (2020-06-15T17:47:13Z) - Nonparametric Score Estimators [49.42469547970041]
Estimating the score from a set of samples generated by an unknown distribution is a fundamental task in inference and learning of probabilistic models.
We provide a unifying view of these estimators under the framework of regularized nonparametric regression.
We propose score estimators based on iterative regularization that enjoy computational benefits from curl-free kernels and fast convergence.
arXiv Detail & Related papers (2020-05-20T15:01:03Z)