On regularized Radon-Nikodym differentiation
- URL: http://arxiv.org/abs/2308.07887v1
- Date: Tue, 15 Aug 2023 17:27:16 GMT
- Title: On regularized Radon-Nikodym differentiation
- Authors: Duc Hoan Nguyen and Werner Zellinger and Sergei V. Pereverzyev
- Abstract summary: We discuss the problem of estimating Radon-Nikodym derivatives.
We employ the general regularization scheme in reproducing kernel Hilbert spaces.
We find that the reconstruction of Radon-Nikodym derivatives at any particular point can be done with a high order of accuracy.
- Score: 3.047411947074805
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We discuss the problem of estimating Radon-Nikodym derivatives. This problem
appears in various applications, such as covariate shift adaptation,
likelihood-ratio testing, mutual information estimation, and conditional
probability estimation. To address the above problem, we employ the general
regularization scheme in reproducing kernel Hilbert spaces. The convergence
rate of the corresponding regularized algorithm is established by taking into
account both the smoothness of the derivative and the capacity of the space in
which it is estimated. This is done in terms of general source conditions and
the regularized Christoffel functions. We also find that the reconstruction of
Radon-Nikodym derivatives at any particular point can be done with a high order
of accuracy. Our theoretical results are illustrated by numerical simulations.
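The regularized scheme described in the abstract can be illustrated with a standard kernel-based density-ratio estimator. The sketch below is not the paper's algorithm: it is the well-known uLSIF-style Tikhonov-regularized least-squares estimate of r = dP/dQ with a Gaussian kernel, and all function names, parameters, and sample sizes are illustrative assumptions.

```python
# Sketch of a kernel-based Radon-Nikodym (density-ratio) estimator.
# NOT the paper's exact method: a uLSIF-style Tikhonov-regularized
# least-squares scheme, shown only to illustrate regularized estimation
# of r = dP/dQ in an RKHS.
import numpy as np

def gaussian_kernel(X, C, sigma=1.0):
    """Gram matrix k(x_i, c_j) for a Gaussian RBF kernel."""
    d2 = (X[:, None, :] - C[None, :, :]) ** 2
    return np.exp(-d2.sum(axis=2) / (2 * sigma**2))

def estimate_ratio(x_p, x_q, lam=0.1, sigma=1.0):
    """Estimate r = dP/dQ from samples x_p ~ P and x_q ~ Q.

    Model: r(x) = sum_j alpha_j k(x, c_j) with centres c_j = x_p.
    Minimize (1/2) a'Ha - h'a + (lam/2)|a|^2 with
    H = empirical E_Q[phi phi'], h = empirical E_P[phi].
    """
    C = x_p
    Phi_q = gaussian_kernel(x_q, C, sigma)   # features under Q
    Phi_p = gaussian_kernel(x_p, C, sigma)   # features under P
    H = Phi_q.T @ Phi_q / len(x_q)
    h = Phi_p.mean(axis=0)
    alpha = np.linalg.solve(H + lam * np.eye(len(C)), h)
    return lambda x: gaussian_kernel(x, C, sigma) @ alpha

rng = np.random.default_rng(0)
x_p = rng.normal(0.0, 1.0, size=(200, 1))   # samples from P
x_q = rng.normal(0.5, 1.0, size=(200, 1))   # samples from Q
r_hat = estimate_ratio(x_p, x_q)
# Sanity check: E_Q[dP/dQ] = 1, so the average estimated ratio over the
# Q samples should be close to 1 (biased slightly by regularization).
mean_ratio = r_hat(x_q).mean()
```

The closed-form solve mirrors the Tikhonov regularization the abstract refers to; swapping `lam * np.eye(...)` for other spectral filters gives the "general regularization scheme" family.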
Related papers
- Bayesian Renormalization [68.8204255655161]
We present a fully information theoretic approach to renormalization inspired by Bayesian statistical inference.
The main insight of Bayesian Renormalization is that the Fisher metric defines a correlation length that plays the role of an emergent RG scale.
We provide insight into how the Bayesian Renormalization scheme relates to existing methods for data compression and data generation.
arXiv Detail & Related papers (2023-05-17T18:00:28Z) - Semi-parametric inference based on adaptively collected data [34.56133468275712]
We construct suitably weighted estimating equations that account for adaptivity in data collection.
Our results characterize the degree of "explorability" required for normality to hold.
We illustrate our general theory with concrete consequences for various problems, including standard linear bandits and sparse generalized bandits.
arXiv Detail & Related papers (2023-03-05T00:45:32Z) - Statistical Inverse Problems in Hilbert Scales [0.0]
We study the Tikhonov regularization scheme in Hilbert scales for the nonlinear statistical inverse problem with a general noise.
The regularizing norm in this scheme is stronger than the norm of the underlying Hilbert space.
arXiv Detail & Related papers (2022-08-28T21:06:05Z) - Experimental Design for Linear Functionals in Reproducing Kernel Hilbert
Spaces [102.08678737900541]
We provide algorithms for constructing bias-aware designs for linear functionals.
We derive non-asymptotic confidence sets for fixed and adaptive designs under sub-Gaussian noise.
arXiv Detail & Related papers (2022-05-26T20:56:25Z) - Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector
Problems [98.34292831923335]
Motivated by the problem of online correlation analysis, we propose the Stochastic Scaled-Gradient Descent (SSD) algorithm.
We bring these ideas together in an application to online correlation analysis, deriving for the first time an optimal one-time-scale algorithm with an explicit rate of local convergence to normality.
arXiv Detail & Related papers (2021-12-29T18:46:52Z) - Uniform Function Estimators in Reproducing Kernel Hilbert Spaces [0.0]
This paper addresses the regression problem of reconstructing functions that are observed with superimposed errors at random locations.
It is demonstrated that the estimator, which is often derived by employing Gaussian random fields, converges in the mean norm of the reproducing kernel Hilbert space to the conditional expectation.
arXiv Detail & Related papers (2021-08-16T08:13:28Z) - The Connection between Discrete- and Continuous-Time Descriptions of
Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of invariance under coarse-graining.
This result explains why combining differencing schemes for derivative reconstruction with local-in-time inference approaches does not work for time series analysis of second or higher order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z) - Local optimization on pure Gaussian state manifolds [63.76263875368856]
We exploit insights into the geometry of bosonic and fermionic Gaussian states to develop an efficient local optimization algorithm.
The method is based on notions of gradient descent attuned to the local geometry.
We use the presented methods to collect numerical and analytical evidence for the conjecture that Gaussian purifications are sufficient to compute the entanglement of purification of arbitrary mixed Gaussian states.
arXiv Detail & Related papers (2020-09-24T18:00:36Z) - Faster Wasserstein Distance Estimation with the Sinkhorn Divergence [0.0]
The squared Wasserstein distance is a quantity to compare probability distributions in a non-parametric setting.
In this work, we propose instead to estimate it with the Sinkhorn divergence.
We show that, for smooth densities, this estimator has a comparable sample complexity but allows higher regularization levels.
arXiv Detail & Related papers (2020-06-15T06:58:16Z) - SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for
Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
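The SLEIPNIR entry above relies on deterministic quadrature Fourier features; the simpler randomized variant (Rahimi-Rechts random Fourier features) conveys the same idea of approximating a Gaussian kernel by an explicit finite feature map. The sketch below uses that Monte-Carlo variant, not SLEIPNIR's quadrature construction, and all names are illustrative.

```python
# Minimal random Fourier features approximating a Gaussian kernel:
# z(x)'z(y) ~ exp(-|x - y|^2 / 2). This is the Monte-Carlo cousin of
# the deterministic quadrature features used by SLEIPNIR, shown only
# to illustrate the feature-expansion idea.
import numpy as np

def rff(X, W, b):
    """Feature map z(x) with E[z(x)'z(y)] = exp(-|x - y|^2 / 2)."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
d, D = 2, 10_000                     # input dim, number of features
W = rng.normal(size=(d, D))          # frequencies ~ N(0, I), sigma = 1
b = rng.uniform(0, 2 * np.pi, D)     # random phases

x = rng.normal(size=(1, d))
y = rng.normal(size=(1, d))
exact = np.exp(-np.sum((x - y) ** 2) / 2)
approx = (rff(x, W, b) @ rff(y, W, b).T).item()
# Monte-Carlo error decays like 1/sqrt(D); SLEIPNIR's quadrature
# features replace this rate with exponentially fast decay.
```

Kernel regression then proceeds with ordinary linear algebra on the explicit features, which is what makes the expansion attractive for scaling GPs.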
This list is automatically generated from the titles and abstracts of the papers in this site.