Metrizing Weak Convergence with Maximum Mean Discrepancies
- URL: http://arxiv.org/abs/2006.09268v3
- Date: Fri, 3 Sep 2021 13:03:54 GMT
- Title: Metrizing Weak Convergence with Maximum Mean Discrepancies
- Authors: Carl-Johann Simon-Gabriel, Alessandro Barp, Bernhard Schölkopf, and Lester Mackey
- Abstract summary: This paper characterizes the maximum mean discrepancies (MMD) that metrize the weak convergence of probability measures for a wide class of kernels.
We prove that, on a locally compact, non-compact, Hausdorff space, the MMD of a bounded continuous Borel measurable kernel k whose RKHS functions vanish at infinity metrizes the weak convergence of probability measures if and only if k is continuous and integrally strictly positive definite (i.s.p.d.) over all signed, finite, regular Borel measures.
- Score: 88.54422104669078
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper characterizes the maximum mean discrepancies (MMD) that metrize
the weak convergence of probability measures for a wide class of kernels. More
precisely, we prove that, on a locally compact, non-compact, Hausdorff space,
the MMD of a bounded continuous Borel measurable kernel k, whose reproducing
kernel Hilbert space (RKHS) functions vanish at infinity, metrizes the weak
convergence of probability measures if and only if k is continuous and
integrally strictly positive definite (i.s.p.d.) over all signed, finite,
regular Borel measures. We also correct a prior result of Simon-Gabriel &
Schölkopf (JMLR, 2018, Thm. 12) by showing that there exist both bounded
continuous i.s.p.d. kernels that do not metrize weak convergence and bounded
continuous non-i.s.p.d. kernels that do metrize it.
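For a kernel k with RKHS H_k, the MMD in question is the RKHS distance between the kernel mean embeddings of the two measures:

```latex
\mathrm{MMD}_k(P, Q) \;=\; \Big\| \int k(x, \cdot)\, \mathrm{d}P(x) \;-\; \int k(y, \cdot)\, \mathrm{d}Q(y) \Big\|_{\mathcal{H}_k}.
```

A minimal numerical sketch (an illustration under assumed choices, not the paper's construction): with a Gaussian kernel, which is bounded, continuous, and i.s.p.d. on R^d, the empirical MMD between a fixed target and a weakly converging sequence of shifted Gaussians shrinks.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)): bounded, continuous, i.s.p.d. on R^d.
    sq_dists = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

def mmd_squared(X, Y, bandwidth=1.0):
    # Biased (V-statistic) estimate of MMD^2(P, Q) from samples X ~ P, Y ~ Q.
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

rng = np.random.default_rng(0)
P = rng.normal(0.0, 1.0, size=(2000, 1))
for n in [1, 4, 16, 64]:
    # Q_n = N(1/n, 1) converges weakly to N(0, 1); the estimated MMD^2 decreases accordingly.
    Q = rng.normal(1.0 / n, 1.0, size=(2000, 1))
    print(n, mmd_squared(P, Q))
```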
Related papers
- Hilbert's projective metric for functions of bounded growth and exponential convergence of Sinkhorn's algorithm [1.6317061277457001]
We study versions of Hilbert's projective metric for spaces of integrable functions of bounded growth.
We show that kernel integral operators are contractions with respect to suitable specifications of such metrics.
As an application to entropic optimal transport, we show exponential convergence of Sinkhorn's algorithm in settings where the marginal distributions have sufficiently light tails.
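For reference, the Sinkhorn iterations whose exponential convergence is studied alternate rescalings of the Gibbs kernel (a minimal discrete sketch, not the paper's unbounded-growth setting):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=200):
    # Entropic optimal transport: alternately rescale u, v so that the plan
    # diag(u) K diag(v), with Gibbs kernel K = exp(-C / eps), matches the marginals a, b.
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]  # approximate transport plan

# Toy example: uniform marginals on two small point clouds on the real line.
rng = np.random.default_rng(0)
x, y = rng.normal(size=10), rng.normal(size=10)
C = (x[:, None] - y[None, :]) ** 2
plan = sinkhorn(np.full(10, 0.1), np.full(10, 0.1), C)
print(plan.sum(axis=1), plan.sum(axis=0))  # row sums ~ a, column sums ~ b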
arXiv Detail & Related papers (2023-11-07T14:53:23Z)
- Approximation of optimization problems with constraints through kernel Sum-Of-Squares [77.27820145069515]
We show that pointwise inequalities are turned into equalities within a class of nonnegative kSoS functions.
We also show that focusing on pointwise equality constraints enables the use of scattering inequalities to mitigate the curse of dimensionality in sampling the constraints.
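For orientation (a hypothetical toy construction, not the paper's algorithm): a kernel sum-of-squares model f(x) = phi(x)^T A phi(x), with A positive semidefinite and phi the vector of kernel evaluations at anchor points, is nonnegative everywhere by construction, which is what makes the class natural for turning pointwise inequality constraints into equalities.

```python
import numpy as np

def ksos_value(x, anchors, A, bandwidth=1.0):
    # f(x) = phi(x)^T A phi(x) with phi(x) = (k(z_1, x), ..., k(z_m, x)) and A PSD,
    # so f(x) >= 0 for every x: a kernel sum-of-squares (kSoS) function.
    phi = np.exp(-(anchors - x) ** 2 / (2 * bandwidth ** 2))
    return phi @ A @ phi

rng = np.random.default_rng(0)
anchors = np.linspace(-1.0, 1.0, 5)
B = rng.normal(size=(5, 5))
A = B @ B.T                      # any PSD matrix yields a valid kSoS model
values = [ksos_value(t, anchors, A) for t in np.linspace(-3.0, 3.0, 101)]
print(min(values))               # nonnegative, as guaranteed by the construction
```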
arXiv Detail & Related papers (2023-01-16T10:30:04Z)
- A lower confidence sequence for the changing mean of non-negative right heavy-tailed observations with bounded mean [9.289846887298854]
A confidence sequence produces an adapted sequence of sets for a predictable parameter sequence with a time-uniform coverage guarantee.
This work constructs a non-asymptotic lower CS for the running average conditional expectation whose slack converges to zero.
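To illustrate the interface of a lower confidence sequence (a simple Hoeffding-based sketch for bounded i.i.d. observations, not the paper's heavy-tailed, changing-mean construction): the lower bounds must hold simultaneously over all times with probability at least 1 - alpha.

```python
import numpy as np

def hoeffding_lower_cs(xs, alpha=0.05):
    # Anytime-valid lower bounds for the mean of i.i.d. observations in [0, 1]:
    # a Hoeffding bound at each time t, union-bounded over t with
    # alpha_t = alpha * 6 / (pi^2 * t^2), so the total error budget is alpha.
    bounds = []
    for t in range(1, len(xs) + 1):
        alpha_t = alpha * 6.0 / (np.pi ** 2 * t ** 2)
        bounds.append(np.mean(xs[:t]) - np.sqrt(np.log(1.0 / alpha_t) / (2.0 * t)))
    return np.array(bounds)

rng = np.random.default_rng(0)
xs = rng.uniform(size=1000)                  # true mean 0.5
print(hoeffding_lower_cs(xs)[[9, 99, 999]])  # lower bounds tighten toward 0.5
```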
arXiv Detail & Related papers (2022-10-20T09:50:05Z)
- Targeted Separation and Convergence with Kernel Discrepancies [61.973643031360254]
Kernel-based discrepancy measures are required to (i) separate a target P from other probability measures or (ii) control weak convergence to P.
In this article, we derive new sufficient and necessary conditions to ensure (i) and (ii).
For MMDs on separable metric spaces, we characterize those kernels that separate Bochner embeddable measures and introduce simple conditions for separating all measures with unbounded kernels.
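A quick numerical contrast for point (i) under assumed kernels (an illustrative sketch only): a linear kernel embeds only the mean and therefore fails to separate two distributions that share it, while a Gaussian kernel does.

```python
import numpy as np

def mmd_sq(X, Y, kernel):
    # Biased estimate of MMD^2 for a generic kernel function kernel(X, Y).
    return kernel(X, X).mean() + kernel(Y, Y).mean() - 2 * kernel(X, Y).mean()

def linear(X, Y):
    return X @ Y.T                        # embeds only the mean: not characteristic

def gauss(X, Y, h=1.0):
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * h**2))       # characteristic on R^d

rng = np.random.default_rng(0)
P = rng.normal(0.0, 1.0, size=(2000, 1))  # N(0, 1)
Q = rng.normal(0.0, 2.0, size=(2000, 1))  # N(0, 4): same mean, different law
print(mmd_sq(P, Q, linear))               # ~0: the linear kernel cannot separate P and Q
print(mmd_sq(P, Q, gauss))                # clearly positive
```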
arXiv Detail & Related papers (2022-09-26T16:41:16Z)
- A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional mean embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
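As a reminder of the object being manipulated (a toy sketch with an assumed Gaussian kernel, not the paper's algorithms): the empirical kernel mean embedding of a sample is the average of kernel sections, a function in the RKHS that can be evaluated anywhere.

```python
import numpy as np

def mean_embedding(X, bandwidth=1.0):
    # Empirical kernel mean embedding mu_P(.) = (1/n) sum_i k(x_i, .)
    # for a one-dimensional Gaussian kernel; returns a callable RKHS element.
    def mu(t):
        return np.mean(np.exp(-(X - t) ** 2 / (2 * bandwidth ** 2)))
    return mu

rng = np.random.default_rng(0)
mu_P = mean_embedding(rng.normal(size=1000))
print(mu_P(0.0), mu_P(3.0))  # larger where the sample puts more mass
```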
arXiv Detail & Related papers (2021-06-18T08:33:45Z)
- A Class of Dimension-free Metrics for the Convergence of Empirical Measures [6.253771639590562]
We show that under the proposed metrics, the convergence of empirical measures in high dimensions is free of the curse of dimensionality (CoD).
Examples of selected test function spaces include the kernel reproducing Hilbert spaces, Barron space, and flow-induced function spaces.
We show that the proposed class of metrics is a powerful tool to analyze the convergence of empirical measures in high dimensions without CoD.
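The common template behind such metrics (stated here as a reminder, in notation that may differ from the paper's): each test function space F induces an integral probability metric

```latex
d_{\mathcal{F}}(\mu, \nu) \;=\; \sup_{f \in \mathcal{F},\ \|f\|_{\mathcal{F}} \le 1} \left| \int f \, \mathrm{d}\mu - \int f \, \mathrm{d}\nu \right|,
```

and taking F to be the unit ball of an RKHS recovers the MMD, while Barron and flow-induced spaces give other members of the class.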
arXiv Detail & Related papers (2021-04-24T23:27:40Z)
- Strong Uniform Consistency with Rates for Kernel Density Estimators with General Kernels on Manifolds [11.927892660941643]
We show how to handle kernel density estimation with intricate kernels not designed by the user.
The isotropic kernels considered in this paper are different from the kernels in the Vapnik-Chervonenkis class that are frequently considered in the statistics community.
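For contrast with the manifold setting studied here, a plain Euclidean kernel density estimator with a user-chosen Gaussian kernel looks as follows (an assumed baseline, not the paper's estimator):

```python
import numpy as np

def kde(query, samples, bandwidth=0.3):
    # One-dimensional Gaussian kernel density estimate on Euclidean space;
    # the paper handles far more general isotropic kernels on unknown manifolds.
    diffs = (query[:, None] - samples[None, :]) / bandwidth
    return np.mean(np.exp(-0.5 * diffs ** 2), axis=1) / (bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
samples = rng.normal(size=2000)
grid = np.linspace(-3.0, 3.0, 7)
print(kde(grid, samples))  # roughly the standard normal density on the grid
```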
arXiv Detail & Related papers (2020-07-13T14:36:06Z)
- The Convergence Indicator: Improved and completely characterized parameter bounds for actual convergence of Particle Swarm Optimization [68.8204255655161]
We introduce a new convergence indicator that can be used to determine whether the particles will eventually converge to a single point or diverge.
Using this convergence indicator we provide the actual bounds completely characterizing parameter regions that lead to a converging swarm.
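For reference, the standard PSO update whose parameters (inertia w and acceleration coefficients c1, c2) determine whether the swarm contracts (a generic sketch, not the paper's indicator itself):

```python
import numpy as np

def pso(f, dim=2, n_particles=30, n_iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    # Standard particle swarm optimization; whether the swarm converges to a single
    # point depends on (w, c1, c2), the regime characterized by the paper's indicator.
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([f(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

print(pso(lambda z: float(np.sum(z ** 2))))  # should approach the origin
```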
arXiv Detail & Related papers (2020-06-06T19:08:05Z)
- On Linear Stochastic Approximation: Fine-grained Polyak-Ruppert and Non-Asymptotic Concentration [115.1954841020189]
We study the asymptotic and non-asymptotic properties of stochastic approximation procedures with Polyak-Ruppert averaging.
We prove a central limit theorem (CLT) for the averaged iterates with fixed step size and number of iterations going to infinity.
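A minimal simulation of the object of study (assumed toy matrices, not the paper's analysis): linear stochastic approximation with a fixed step size, with and without Polyak-Ruppert averaging of the iterates.

```python
import numpy as np

def lsa(A, b, alpha=0.05, n_iters=20000, noise=0.1, seed=0):
    # Linear stochastic approximation theta_{k+1} = theta_k - alpha * (A theta_k - b + xi_k)
    # together with the Polyak-Ruppert average of all iterates.
    rng = np.random.default_rng(seed)
    theta = np.zeros(len(b))
    average = np.zeros(len(b))
    for k in range(1, n_iters + 1):
        theta = theta - alpha * (A @ theta - b + noise * rng.normal(size=len(b)))
        average += (theta - average) / k
    return theta, average

A = np.array([[2.0, 0.5], [0.5, 1.0]])   # positive definite, so the iteration is stable
b = np.array([1.0, -1.0])
target = np.linalg.solve(A, b)           # fixed point theta* with A theta* = b
last, averaged = lsa(A, b)
print(np.linalg.norm(last - target), np.linalg.norm(averaged - target))  # averaging helps
```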
arXiv Detail & Related papers (2020-04-09T17:54:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.