Lecture notes: Efficient approximation of kernel functions
- URL: http://arxiv.org/abs/2005.01566v1
- Date: Mon, 4 May 2020 15:30:06 GMT
- Title: Lecture notes: Efficient approximation of kernel functions
- Authors: Amitabha Bagchi
- Abstract summary: These notes endeavour to collect in one place the mathematical background required to understand the properties of kernels in general.
We briefly motivate the use of kernels in Machine Learning with the example of the support vector machine.
After a brief discussion of Hilbert spaces, including the Reproducing Kernel Hilbert Space construction, we present Mercer's theorem.
- Score: 4.177892889752434
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: These lecture notes endeavour to collect in one place the mathematical
background required to understand the properties of kernels in general and the
Random Fourier Features approximation of Rahimi and Recht (NIPS 2007) in
particular. We briefly motivate the use of kernels in Machine Learning with the
example of the support vector machine. We discuss positive definite and
conditionally negative definite kernels in some detail. After a brief
discussion of Hilbert spaces, including the Reproducing Kernel Hilbert Space
construction, we present Mercer's theorem. We discuss the Random Fourier
Features technique and then present, with proofs, scalar and matrix
concentration results that help us estimate the error incurred by the
technique. These notes are the transcription of 10 lectures given at IIT Delhi
between January and April 2020.
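As a concrete illustration of the main technique the notes build up to, here is a minimal NumPy sketch (not taken from the notes themselves) of the Random Fourier Features approximation of Rahimi and Recht applied to the Gaussian kernel. The function names and the parameters sigma (bandwidth) and D (number of random features) are illustrative choices, not notation from the notes.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Exact Gaussian (RBF) kernel matrix k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq_dists = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq_dists / (2 * sigma**2))

def random_fourier_features(X, D=500, sigma=1.0, seed=None):
    """Map X to D random Fourier features z(x) = sqrt(2/D) * cos(W x + b).

    For the Gaussian kernel, Bochner's theorem gives the spectral density
    N(0, sigma^{-2} I); the phases b are uniform on [0, 2*pi].  Then
    E[z(x)^T z(y)] = k(x, y), so Z Z^T approximates the kernel matrix.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))   # samples from the spectral density
    b = rng.uniform(0, 2 * np.pi, size=D)            # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Compare the exact kernel matrix with its RFF approximation.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
K_exact = gaussian_kernel(X, X, sigma=1.5)
Z = random_fourier_features(X, D=2000, sigma=1.5, seed=1)
K_approx = Z @ Z.T
print("max entrywise error:", np.abs(K_exact - K_approx).max())
```

Increasing D tightens the approximation; the scalar and matrix concentration results presented in the notes quantify exactly how fast this error shrinks.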
Related papers
- On the Approximation of Kernel functions [0.0]
The paper addresses approximations of the kernel itself.
For the Hilbert Gauss kernel on the unit cube, the paper establishes an upper bound on the associated eigenfunctions.
This improvement supports low-rank approximation methods such as the Nyström method.
arXiv Detail & Related papers (2024-03-11T13:50:07Z)
- Reproducing kernel Hilbert spaces in the mean field limit [6.844996517347866]
Kernel methods are based on function spaces generated by kernels, so-called reproducing kernel Hilbert spaces.
We show the rigorous mean field limit of kernels and provide a detailed analysis of the limiting reproducing kernel Hilbert space.
arXiv Detail & Related papers (2023-02-28T09:46:44Z)
- Kernel Subspace and Feature Extraction [7.424262881242935]
We study kernel methods in machine learning from the perspective of feature subspace.
We construct a kernel from Hirschfeld-Gebelein-Rényi maximal correlation functions, coined the maximal correlation kernel, and demonstrate its information-theoretic optimality.
arXiv Detail & Related papers (2023-01-04T02:46:11Z)
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z)
- Experimental Design for Linear Functionals in Reproducing Kernel Hilbert Spaces [102.08678737900541]
We provide algorithms for constructing bias-aware designs for linear functionals.
We derive non-asymptotic confidence sets for fixed and adaptive designs under sub-Gaussian noise.
arXiv Detail & Related papers (2022-05-26T20:56:25Z)
- On the Benefits of Large Learning Rates for Kernel Methods [110.03020563291788]
We show that the benefit of large learning rates can be precisely characterized in the context of kernel methods.
We consider the minimization of a quadratic objective in a separable Hilbert space, and show that with early stopping, the choice of learning rate influences the spectral decomposition of the obtained solution.
arXiv Detail & Related papers (2022-02-28T13:01:04Z)
- Kernel Continual Learning [117.79080100313722]
Kernel continual learning is a simple but effective variant of continual learning to tackle catastrophic forgetting.
An episodic memory unit stores a subset of samples for each task to learn task-specific classifiers based on kernel ridge regression.
Variational random features are used to learn a data-driven kernel for each task.
arXiv Detail & Related papers (2021-07-12T22:09:30Z)
- Reproducing Kernel Hilbert Space, Mercer's Theorem, Eigenfunctions, Nyström Method, and Use of Kernels in Machine Learning: Tutorial and Survey [5.967999555890417]
We start with reviewing the history of kernels in functional analysis and machine learning.
We introduce types of use of kernels in machine learning including kernel methods, kernel learning by semi-definite programming, Hilbert-Schmidt independence criterion, maximum mean discrepancy, kernel mean embedding, and kernel dimensionality reduction.
This paper can be useful for various fields of science including machine learning, dimensionality reduction, functional analysis in mathematics, and mathematical physics in quantum mechanics.
arXiv Detail & Related papers (2021-06-15T21:29:12Z)
- Advanced Stationary and Non-Stationary Kernel Designs for Domain-Aware Gaussian Processes [0.0]
We propose advanced kernel designs that only allow functions with certain desirable characteristics to be elements of the reproducing kernel Hilbert space (RKHS).
We will show the impact of advanced kernel designs on Gaussian processes using several synthetic and two scientific data sets.
arXiv Detail & Related papers (2021-02-05T22:07:56Z)
- Learning Set Functions that are Sparse in Non-Orthogonal Fourier Bases [73.53227696624306]
We present a new family of algorithms for learning Fourier-sparse set functions.
In contrast to other work that focused on the Walsh-Hadamard transform, our novel algorithms operate with recently introduced non-orthogonal Fourier transforms.
We demonstrate effectiveness on several real-world applications.
arXiv Detail & Related papers (2020-10-01T14:31:59Z)
- Matérn Gaussian processes on Riemannian manifolds [81.15349473870816]
We show how to generalize the widely-used Matérn class of Gaussian processes.
We also extend the generalization from the Matérn to the widely-used squared exponential process.
arXiv Detail & Related papers (2020-06-17T21:05:42Z)