Information-Theoretic Bounds for Integral Estimation
- URL: http://arxiv.org/abs/2102.10199v1
- Date: Fri, 19 Feb 2021 23:07:26 GMT
- Title: Information-Theoretic Bounds for Integral Estimation
- Authors: Donald Q. Adams and Adarsh Barik and Jean Honorio
- Abstract summary: In this paper, we consider a zero-order oracle model of estimating definite integrals.
We find that the information-theoretic error lower bound for estimating the integral of a $d$-dimensional function is $\Omega(2^d r^{d+1}\sqrt{d/T})$.
- Score: 27.243424330309608
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we consider a zero-order stochastic oracle model of estimating
definite integrals. In this model, integral estimation methods may query an
oracle function for a fixed number of noisy values of the integrand function
and use these values to produce an estimate of the integral. We first show that
the information-theoretic error lower bound for estimating the integral of a
$d$-dimensional function over a region with $l_\infty$ radius $r$ using at most
$T$ queries to the oracle function is $\Omega(2^d r^{d+1}\sqrt{d/T})$.
Additionally, we find that the Gaussian Quadrature method under the same model
achieves a rate of $O(2^{d}r^d/\sqrt{T})$ for functions with zero fourth and
higher-order derivatives with respect to individual dimensions, and for
Gaussian oracles, this rate is tight. For functions with nonzero fourth
derivatives, the Gaussian Quadrature method achieves an upper bound which is
not tight with the information-theoretic lower bound. Therefore, it is not
minimax optimal, so there is space for the development of better integral
estimation methods for such functions.
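The oracle model above can be illustrated with a minimal sketch: each query returns a noisy integrand value, and a Gauss-Legendre quadrature estimator spreads its budget of $T$ queries over the nodes, averaging repeated queries at each node. The function `noisy_oracle`, the test integrand, the noise level `SIGMA`, and the budget are all illustrative assumptions, not the paper's construction.

```python
import numpy as np

SIGMA = 0.1  # oracle noise standard deviation (assumed)
rng = np.random.default_rng(0)

def noisy_oracle(f, x):
    """One zero-order query: f(x) corrupted by Gaussian noise."""
    return f(x) + rng.normal(0.0, SIGMA)

def gq_estimate(f, r, n_nodes, T):
    """Estimate the integral of f over [-r, r] using at most T noisy queries.

    Queries are split evenly across the Gauss-Legendre nodes; averaging the
    repeated queries at each node shrinks the noise variance by ~n_nodes/T.
    """
    nodes, weights = np.polynomial.legendre.leggauss(n_nodes)
    reps = T // n_nodes  # queries per node
    node_means = np.array([
        np.mean([noisy_oracle(f, r * x) for _ in range(reps)])
        for x in nodes
    ])
    return r * np.dot(weights, node_means)  # rescale from [-1, 1] to [-r, r]

# Example: a cubic has zero fourth and higher-order derivatives, so
# 3-node Gauss-Legendre quadrature has no truncation bias and the
# remaining error comes only from the oracle noise, decaying as 1/sqrt(T).
f = lambda x: x**3 + 2 * x + 1  # true integral over [-1, 1] is 2
est = gq_estimate(f, r=1.0, n_nodes=3, T=3000)
print(abs(est - 2.0))
```

For integrands in this zero-fourth-derivative class the residual error is pure averaged noise, which matches the $O(2^d r^d/\sqrt{T})$ behavior described in the abstract (here $d=1$).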
Related papers
- Fast Convergence for High-Order ODE Solvers in Diffusion Probabilistic Models [5.939858158928473]
Diffusion probabilistic models generate samples by learning to reverse a noise-injection process that transforms data into noise.
Reformulating this reverse process as a deterministic probability flow ordinary differential equation (ODE) enables efficient sampling using high-order solvers.
Since the score function is typically approximated by a neural network, analyzing the interaction between its regularity, approximation error, and numerical integration error is key to understanding the overall sampling accuracy.
arXiv Detail & Related papers (2025-06-16T03:09:25Z) - Learning a Sparse Representation of Barron Functions with the Inverse
Scale Space Flow [3.249853429482705]
Given an $L^2$ function $f$, the inverse scale space flow is used to find a sparse measure $\mu$.
The convergence properties of this method are analysed in an ideal setting and in the cases of measurement noise and sampling bias.
arXiv Detail & Related papers (2023-12-05T11:26:02Z) - Efficient Numerical Integration in Reproducing Kernel Hilbert Spaces via
Leverage Scores Sampling [16.992480926905067]
We consider the problem of approximating integrals with respect to a target probability measure using only pointwise evaluations of the integrand.
We propose an efficient procedure which exploits a small i.i.d. random subset of $m \ll n$ samples drawn either uniformly or using approximate leverage scores from the initial observations.
arXiv Detail & Related papers (2023-11-22T17:44:18Z) - Kernel-based off-policy estimation without overlap: Instance optimality
beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z) - Extra-Newton: A First Approach to Noise-Adaptive Accelerated
Second-Order Methods [57.050204432302195]
This work proposes a universal and adaptive second-order method for minimizing second-order smooth, convex functions.
Our algorithm achieves $O(\sigma/\sqrt{T})$ convergence when the oracle feedback has variance $\sigma^2$, and improves its convergence to $O(1/T^3)$ with deterministic oracles.
arXiv Detail & Related papers (2022-11-03T14:12:51Z) - Finding Global Minima via Kernel Approximations [90.42048080064849]
We consider the global minimization of smooth functions based solely on function evaluations.
In this paper, we consider an approach that jointly models the function to approximate and finds a global minimum.
arXiv Detail & Related papers (2020-12-22T12:59:30Z) - Fourier Sparse Leverage Scores and Approximate Kernel Learning [29.104055676527913]
We prove new explicit upper bounds on the leverage scores of Fourier functions under both the Gaussian and Laplace measures.
Our bounds are motivated by two important applications in machine learning.
arXiv Detail & Related papers (2020-06-12T17:25:39Z) - On Suboptimality of Least Squares with Application to Estimation of
Convex Bodies [74.39616164169131]
We settle an open problem regarding optimality of Least Squares in estimating a convex set from noisy support function measurements in dimension $d \geq 6$.
We establish that Least Squares is sub-optimal, and achieves a rate of $\tilde{\Theta}_d(n^{-2/(d-1)})$ whereas the minimax rate is $\Theta_d(n^{-4/(d+3)})$.
arXiv Detail & Related papers (2020-06-07T05:19:00Z) - Reinforcement Learning with General Value Function Approximation:
Provably Efficient Approach via Bounded Eluder Dimension [124.7752517531109]
We establish a provably efficient reinforcement learning algorithm with general value function approximation.
We show that our algorithm achieves a regret bound of $\widetilde{O}(\mathrm{poly}(dH)\sqrt{T})$ where $d$ is a complexity measure.
Our theory generalizes recent progress on RL with linear value function approximation and does not make explicit assumptions on the model of the environment.
arXiv Detail & Related papers (2020-05-21T17:36:09Z) - Spectral density estimation with the Gaussian Integral Transform [91.3755431537592]
The spectral density operator $\hat{\rho}(\omega) = \delta(\omega - \hat{H})$ plays a central role in linear response theory.
We describe a near optimal quantum algorithm providing an approximation to the spectral density.
arXiv Detail & Related papers (2020-04-10T03:14:38Z) - Information-Theoretic Lower Bounds for Zero-Order Stochastic Gradient
Estimation [28.776950569604026]
We analyze the necessary number of samples to estimate the gradient of any multidimensional smooth function in a zero-order oracle model.
We show that the finite difference method with a bounded-variance oracle achieves a rate of $O(d^{4/3}/\sqrt{T})$ for functions with zero third and higher-order derivatives.
arXiv Detail & Related papers (2020-03-31T00:11:13Z)
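The zero-order gradient-estimation setting in the entry above admits a similarly minimal sketch: a central finite difference built from noisy oracle queries, with repeated queries averaged to suppress the noise. The test function, noise level, step size, and repetition count are illustrative assumptions.

```python
import numpy as np

SIGMA = 0.01  # oracle noise standard deviation (assumed)
rng = np.random.default_rng(1)

def noisy_oracle(f, x):
    """One zero-order query: f(x) corrupted by Gaussian noise."""
    return f(x) + rng.normal(0.0, SIGMA)

def fd_gradient(f, x, h=0.1, reps=200):
    """Central-difference gradient estimate, averaging reps queries per point."""
    d = len(x)
    grad = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        plus = np.mean([noisy_oracle(f, x + e) for _ in range(reps)])
        minus = np.mean([noisy_oracle(f, x - e) for _ in range(reps)])
        grad[i] = (plus - minus) / (2 * h)  # averaging shrinks noise ~1/sqrt(reps)
    return grad

# A quadratic has zero third and higher-order derivatives, so the central
# difference is free of truncation bias and only the oracle noise remains.
f = lambda x: np.dot(x, x)  # gradient is 2x
x0 = np.array([1.0, -0.5, 2.0])
print(np.max(np.abs(fd_gradient(f, x0) - 2 * x0)))
```

For this zero-third-derivative class the error is pure averaged noise per coordinate, which is the regime where the finite-difference rate quoted above applies.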
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.