On kernel-based statistical learning in the mean field limit
- URL: http://arxiv.org/abs/2310.18074v1
- Date: Fri, 27 Oct 2023 11:42:56 GMT
- Title: On kernel-based statistical learning in the mean field limit
- Authors: Christian Fiedler, Michael Herty, Sebastian Trimpe
- Abstract summary: In many applications of machine learning, a large number of variables are considered.
We consider the situation when the number of input variables goes to infinity.
In particular, we show mean field convergence of empirical and infinite-sample solutions.
- Score: 7.2494787805712395
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In many applications of machine learning, a large number of variables are
considered. Motivated by machine learning of interacting particle systems, we
consider the situation when the number of input variables goes to infinity.
First, we continue the recent investigation of the mean field limit of kernels
and their reproducing kernel Hilbert spaces, completing the existing theory.
Next, we provide results relevant for approximation with such kernels in the
mean field limit, including a representer theorem. Finally, we use these
kernels in the context of statistical learning in the mean field limit,
focusing on Support Vector Machines. In particular, we show mean field
convergence of empirical and infinite-sample solutions as well as the
convergence of the corresponding risks. On the one hand, our results establish
rigorous mean field limits in the context of kernel methods, providing new
theoretical tools and insights for large-scale problems. On the other hand, our
setting corresponds to a new form of limit of learning problems, which seems to
have not been investigated yet in the statistical learning theory literature.
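To make the setting concrete, one natural construction (used here purely for illustration; the paper's definitions are more general) is a permutation-invariant kernel obtained by averaging a base kernel over all particle pairs, so that an input ensemble enters only through its empirical measure. The Python sketch below plugs such a kernel into a standard SVM via a precomputed Gram matrix; the double-average construction, the Gaussian base kernel, and all names and data are our assumptions, not the paper's. Increasing the number of particles N gives an informal feel for the mean field convergence statements above.
```python
# Minimal sketch (not the paper's construction): a permutation-invariant
# "mean-field style" kernel on ensembles of N particles, obtained by averaging
# a base kernel k over all particle pairs,
#     K_N(x, x') = (1/N^2) * sum_{i,j} k(x_i, x'_j),
# which depends on x only through its empirical measure, plugged into a
# standard SVM via a precomputed Gram matrix.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def base_kernel(a, b, gamma=1.0):
    """Gaussian base kernel k(a, b) between single particles in R^d."""
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * d2)

def mean_field_kernel(X, Y):
    """Gram matrix of the averaged kernel between two sets of ensembles.

    X: (m, N, d) array of m ensembles with N particles each; Y: (n, N, d).
    """
    K = np.empty((X.shape[0], Y.shape[0]))
    for i, x in enumerate(X):
        for j, y in enumerate(Y):
            K[i, j] = base_kernel(x, y).mean()  # (1/N^2) sum_{k,l} k(x_k, y_l)
    return K

def make_ensembles(m, N, d=2):
    """Toy data: each ensemble is N particles around a random center;
    the label is the sign of the center's first coordinate."""
    centers = rng.normal(size=(m, d))
    X = centers[:, None, :] + 0.3 * rng.normal(size=(m, N, d))
    y = np.sign(centers[:, 0])
    return X, y

N = 50                                   # number of particles per ensemble
X_tr, y_tr = make_ensembles(200, N)
X_te, y_te = make_ensembles(100, N)

svm = SVC(kernel="precomputed", C=1.0)
svm.fit(mean_field_kernel(X_tr, X_tr), y_tr)
acc = svm.score(mean_field_kernel(X_te, X_tr), y_te)
print(f"test accuracy with N={N} particles: {acc:.2f}")
```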
Related papers
- Estimation of mutual information via quantum kernel method [0.0]
Estimating mutual information (MI) plays a critical role in investigating relationships among multiple random variables with nonlinear correlations.
We propose a method for estimating mutual information using the quantum kernel.
arXiv Detail & Related papers (2023-10-19T00:53:16Z)
- Improved learning theory for kernel distribution regression with two-stage sampling [3.154269505086155]
Kernel methods have become a method of choice for tackling the distribution regression problem.
We introduce a novel near-unbiased condition on the Hilbertian embeddings, which enables us to provide new error bounds.
We show that this near-unbiased condition holds for three important classes of kernels, based on optimal transport and mean embedding.
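As a rough illustration of the mean-embedding route mentioned above (our generic sketch of two-stage-sampled distribution regression, not this paper's estimator, near-unbiased condition, or error bounds): each bag is represented by its empirical kernel mean embedding, a second-level Gaussian kernel is built on embedding distances, and kernel ridge regression is fitted on top. All names and parameters are illustrative.
```python
# Rough sketch of two-stage-sampled distribution regression with kernel mean
# embeddings (the generic recipe, not this paper's estimator or bounds):
# stage 1 samples a distribution, stage 2 samples a bag from it; each bag is
# represented by its empirical mean embedding, and a second-level Gaussian
# kernel on embedding distances feeds kernel ridge regression.
import numpy as np

rng = np.random.default_rng(1)

def rbf(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def embedding_sq_dists(bags):
    """Pairwise squared RKHS distances ||mu_i - mu_j||^2 between empirical
    mean embeddings, computed from base-kernel averages."""
    m = len(bags)
    G = np.array([[rbf(bags[i], bags[j]).mean() for j in range(m)]
                  for i in range(m)])          # G[i, j] = <mu_i, mu_j>
    diag = np.diag(G)
    return diag[:, None] + diag[None, :] - 2 * G

# Two-stage sampling: draw a mean theta (stage 1), then a bag around it
# (stage 2); the regression target is a nonlinear function of theta.
m, n_per_bag = 150, 40
thetas = rng.uniform(-2, 2, size=m)
bags = [theta + 0.5 * rng.normal(size=(n_per_bag, 1)) for theta in thetas]
y = np.sin(thetas)

D2 = embedding_sq_dists(bags)
K = np.exp(-2.0 * D2)                          # second-level Gaussian kernel
lam = 1e-3                                     # ridge regularization
alpha = np.linalg.solve(K + lam * np.eye(m), y)

train_rmse = np.sqrt(np.mean((K @ alpha - y) ** 2))
print(f"in-sample RMSE of the two-stage estimator: {train_rmse:.3f}")
```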
arXiv Detail & Related papers (2023-08-28T06:29:09Z)
- Higher-order topological kernels via quantum computation [68.8204255655161]
Topological data analysis (TDA) has emerged as a powerful tool for extracting meaningful insights from complex data.
We propose a quantum approach to defining Betti kernels, which is based on constructing Betti curves with increasing order.
arXiv Detail & Related papers (2023-07-14T14:48:52Z)
- Reproducing kernel Hilbert spaces in the mean field limit [6.844996517347866]
Kernel methods are based on function spaces generated by kernels, so-called reproducing kernel Hilbert spaces.
We show the rigorous mean field limit of kernels and provide a detailed analysis of the limiting reproducing kernel Hilbert space.
arXiv Detail & Related papers (2023-02-28T09:46:44Z)
- Kernelized Cumulants: Beyond Kernel Mean Embeddings [11.448622437140022]
We extend cumulants to reproducing kernel Hilbert spaces (RKHS) using tools from tensor algebras.
We argue that going beyond degree one has several advantages and can be achieved with the same computational complexity and minimal overhead.
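For intuition, the degree-one object in this picture is the kernel mean embedding, and a classical degree-two object is the cross-covariance operator, whose squared Hilbert-Schmidt norm can be estimated from centered Gram matrices (the familiar HSIC statistic). The sketch below illustrates only that standard degree-two quantity; the paper's kernelized cumulants and estimators are not reproduced here.
```python
# Illustration of "beyond degree one" in the RKHS setting (our example, not
# the paper's estimator): the squared Hilbert-Schmidt norm of the
# cross-covariance operator is estimated from centered Gram matrices at
# Gram-matrix cost.
import numpy as np

rng = np.random.default_rng(2)

def rbf_gram(Z, gamma=1.0):
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def hsic(X, Y, gamma=1.0):
    """Biased estimator of ||C_XY||_HS^2: trace(K H L H) / n^2."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    K, L = rbf_gram(X, gamma), rbf_gram(Y, gamma)
    return np.trace(K @ H @ L @ H) / n ** 2

n = 300
X = rng.normal(size=(n, 1))
Y_dep = np.sin(3 * X) + 0.1 * rng.normal(size=(n, 1))   # nonlinearly dependent
Y_ind = rng.normal(size=(n, 1))                          # independent of X

print(f"HSIC(X, dependent Y):   {hsic(X, Y_dep):.4f}")
print(f"HSIC(X, independent Y): {hsic(X, Y_ind):.4f}")
```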
arXiv Detail & Related papers (2023-01-29T15:31:06Z)
- Interpolation with the polynomial kernels [5.8720142291102135]
Polynomial kernels are widely used in machine learning, and they are one of the default choices for developing kernel-based regression models.
However, they are rarely used or studied in numerical analysis, due to their lack of strict positive definiteness.
This paper is devoted to establishing some initial results for the study of these kernels and their related algorithms.
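A minimal sketch of what such interpolation looks like in practice, under our own illustrative setup rather than the paper's results: in one variable, the degree-5 polynomial kernel spans a 6-dimensional space, so six distinct nodes admit exact interpolation, while more nodes would make the kernel matrix singular, which is why a least-squares solve is used below.
```python
# Small sketch of interpolation with a polynomial kernel (illustrative setup,
# not the paper's results): k(x, y) = (x*y + c)^p only spans a
# finite-dimensional polynomial space, so the kernel matrix is merely positive
# semi-definite in general; here the number of nodes matches that dimension.
import numpy as np

def poly_kernel(A, B, degree=5, c=1.0):
    """Gram matrix of the polynomial kernel between row-wise point sets."""
    return (A @ B.T + c) ** degree

# Six distinct 1-D nodes: degree 5 in one variable spans a 6-dimensional space.
X = np.linspace(-1.0, 1.0, 6)[:, None]
y = np.sin(np.pi * X[:, 0])

K = poly_kernel(X, X)
coef, *_ = np.linalg.lstsq(K, y, rcond=None)   # robust to (near-)singular K

# The interpolant s(x) = sum_i coef_i k(x, x_i) reproduces the data at the nodes.
print("residual at the nodes:", float(np.max(np.abs(K @ coef - y))))

X_grid = np.linspace(-1.0, 1.0, 200)[:, None]
s = poly_kernel(X_grid, X) @ coef
print("max error to sin(pi*x) off the nodes:",
      float(np.max(np.abs(s - np.sin(np.pi * X_grid[:, 0])))))
```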
arXiv Detail & Related papers (2022-12-15T08:30:23Z)
- Noisy Quantum Kernel Machines [58.09028887465797]
An emerging class of quantum learning machines is that based on the paradigm of quantum kernels.
We study how dissipation and decoherence affect their performance.
We show that decoherence and dissipation can be seen as an implicit regularization for the quantum kernel machines.
arXiv Detail & Related papers (2022-04-26T09:52:02Z)
- Minimax Optimization: The Case of Convex-Submodular [50.03984152441271]
Minimax problems extend beyond the continuous domain to mixed continuous-discrete domains or even fully discrete domains.
We introduce the class of convex-submodular minimax problems, where the objective is convex with respect to the continuous variable and submodular with respect to the discrete variable.
Our proposed algorithms are iterative and combine tools from both discrete and continuous optimization.
arXiv Detail & Related papers (2021-11-01T21:06:35Z)
- Kernel Mean Estimation by Marginalized Corrupted Distributions [96.9272743070371]
Estimating the kernel mean in a reproducing kernel Hilbert space is a critical component of many kernel learning algorithms.
We present a new kernel mean estimator, called the marginalized kernel mean estimator, which estimates the kernel mean under the corrupted distribution.
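A rough Monte-Carlo illustration of the underlying idea (not the paper's marginalized estimator or its analysis): the standard empirical estimator averages k(x_i, .) over the sample, while the corruption-aware version additionally averages over noise drawn around each sample point. The Gaussian kernel, the additive Gaussian corruption, and all names are our assumptions.
```python
# Rough sketch of estimating a kernel mean under a corrupted distribution
# (a Monte-Carlo illustration; the paper's marginalized estimator and its
# analysis are not reproduced here).
import numpy as np

rng = np.random.default_rng(4)

def k(a, b, gamma=1.0):
    """Gaussian kernel between 1-D points, broadcast over arrays."""
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

X = rng.normal(loc=1.0, scale=0.7, size=200)       # observed sample
Z = np.array([-1.0, 0.0, 1.0, 2.0])                # evaluation points

# Standard empirical kernel mean embedding: mu_hat(z) = (1/n) sum_i k(x_i, z).
mu_hat = k(X, Z).mean(axis=0)

# Corrupted version: additionally average k over noise draws around each x_i
# (additive Gaussian corruption, drawn by Monte Carlo for simplicity).
n_draws, noise_scale = 50, 0.3
X_corrupt = X[:, None] + noise_scale * rng.normal(size=(X.size, n_draws))
mu_corrupt = k(X_corrupt.ravel(), Z).mean(axis=0)

print("empirical mean embedding at Z:     ", np.round(mu_hat, 3))
print("corruption-averaged embedding at Z:", np.round(mu_corrupt, 3))
```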
arXiv Detail & Related papers (2021-07-10T15:11:28Z)
- Constrained Learning with Non-Convex Losses [119.8736858597118]
Though learning has become a core technology of modern information processing, there is now ample evidence that it can lead to biased, unsafe, and prejudiced solutions.
arXiv Detail & Related papers (2021-03-08T23:10:33Z)
- Fundamental Limits and Tradeoffs in Invariant Representation Learning [99.2368462915979]
Many machine learning applications involve learning representations that achieve two competing goals.
A minimax game-theoretic formulation captures a fundamental tradeoff between accuracy and invariance.
We provide an information-theoretic analysis of this general and important problem under both classification and regression settings.
arXiv Detail & Related papers (2020-12-19T15:24:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.