Face Verification via learning the kernel matrix
- URL: http://arxiv.org/abs/2001.07323v1
- Date: Tue, 21 Jan 2020 03:39:09 GMT
- Title: Face Verification via learning the kernel matrix
- Authors: Ning Yuan, Xiao-Jun Wu and He-Feng Yin
- Abstract summary: The kernel function is introduced to solve nonlinear pattern recognition problems.
A promising approach is to learn the kernel from data automatically.
In this paper, the nonlinear face verification via learning the kernel matrix is proposed.
- Score: 9.414572104591027
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The kernel function is introduced to solve nonlinear pattern
recognition problems. The advantage of a kernel method often depends critically
on a proper choice of the kernel function. A promising approach is to learn the
kernel from data automatically. Over the past few years, the methods proposed
to learn the kernel have had limitations, such as being restricted to learning
the parameters of a prespecified kernel function. In this paper, nonlinear face
verification via learning the kernel matrix is proposed. A new criterion is
used in the new algorithm to avoid inverting the possibly singular within-class
scatter matrix, which is a computational problem. Experimental results obtained
on the XM2VTS face database using the Lausanne protocol show that the
verification performance of the new method is superior to that of the baseline
method, Client Specific Kernel Discriminant Analysis (CSKDA). CSKDA needs to
choose a proper kernel function through many experiments, while the new method
learns the kernel from data automatically, which saves a great deal of time and
gives robust performance.
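The abstract does not spell out the paper's algorithm, but the core idea of learning a kernel (Gram) matrix from data, rather than fixing a kernel function in advance, can be illustrated with kernel-target alignment: weight several candidate base kernels by how well each one agrees with the ideal label kernel yyᵀ. This is a standard heuristic, not the paper's method, and all names and parameter choices below are ours.

```python
import numpy as np

def rbf_gram(X, gamma):
    # Pairwise squared distances -> RBF Gram matrix exp(-gamma * ||x - z||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def alignment(K, y):
    # Kernel-target alignment <K, yy^T>_F / (||K||_F * ||yy^T||_F)
    T = np.outer(y, y)
    return np.sum(K * T) / (np.linalg.norm(K) * np.linalg.norm(T))

rng = np.random.default_rng(0)
# Toy two-class data with labels in {-1, +1}
X = np.vstack([rng.normal(-1, 1, (20, 2)), rng.normal(+1, 1, (20, 2))])
y = np.array([-1] * 20 + [+1] * 20)

# Candidate base kernels: RBF at several widths, plus a linear kernel
bases = [rbf_gram(X, g) for g in (0.01, 0.1, 1.0)] + [X @ X.T]

# Heuristic: weight each base kernel by its (clipped) alignment with the
# labels, then normalise; the learned kernel matrix is the weighted sum.
w = np.array([max(alignment(K, y), 0.0) for K in bases])
w /= w.sum()
K_learned = sum(wi * Ki for wi, Ki in zip(w, bases))

print(round(alignment(K_learned, y), 3))
```

The learned matrix is a convex combination of positive semidefinite Gram matrices, so it remains a valid kernel matrix; no kernel function ever has to be chosen by hand.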
Related papers
- On the Sublinear Regret of GP-UCB [58.25014663727544]
We show that the Gaussian Process Upper Confidence Bound (GP-UCB) algorithm enjoys nearly optimal regret rates.
Our improvements rely on a key technical contribution -- regularizing kernel ridge estimators in proportion to the smoothness of the underlying kernel.
arXiv Detail & Related papers (2023-07-14T13:56:11Z)
- A Simple Algorithm For Scaling Up Kernel Methods [0.0]
We introduce a novel random feature regression algorithm that allows us to scale to virtually infinite numbers of random features.
We illustrate the performance of our method on the CIFAR-10 dataset.
arXiv Detail & Related papers (2023-01-26T20:59:28Z)
- A new trigonometric kernel function for SVM [0.0]
We introduce a new trigonometric kernel function containing one parameter for the machine learning algorithms.
We also conduct an empirical evaluation on the kernel-SVM and kernel-SVR methods and demonstrate its strong performance.
arXiv Detail & Related papers (2022-10-16T17:10:52Z)
- Learning "best" kernels from data in Gaussian process regression. With application to aerodynamics [0.4588028371034406]
We introduce algorithms to select/design kernels in Gaussian process regression/kriging surrogate modeling techniques.
A first class of algorithms is kernel flow, which was introduced in a context of classification in machine learning.
A second class of algorithms is called spectral kernel ridge regression, and aims at selecting a "best" kernel such that the norm of the function to be approximated is minimal.
arXiv Detail & Related papers (2022-06-03T07:50:54Z)
- Meta-Learning Hypothesis Spaces for Sequential Decision-making [79.73213540203389]
We propose to meta-learn a kernel from offline data (Meta-KeL).
Under mild conditions, we guarantee that our estimated RKHS yields valid confidence sets.
We also empirically evaluate the effectiveness of our approach on a Bayesian optimization task.
arXiv Detail & Related papers (2022-02-01T17:46:51Z)
- Kernel Continual Learning [117.79080100313722]
Kernel continual learning is a simple but effective variant of continual learning that tackles catastrophic forgetting.
An episodic memory unit stores a subset of samples for each task, from which task-specific classifiers are learned via kernel ridge regression.
Variational random features are used to learn a data-driven kernel for each task.
arXiv Detail & Related papers (2021-07-12T22:09:30Z)
- Taming Nonconvexity in Kernel Feature Selection---Favorable Properties of the Laplace Kernel [77.73399781313893]
A challenge is to establish the objective function of kernel-based feature selection.
The gradient-based algorithms available for nonconvex optimization are only able to guarantee convergence to local minima.
arXiv Detail & Related papers (2021-06-17T11:05:48Z)
- Kernel Identification Through Transformers [54.3795894579111]
Kernel selection plays a central role in determining the performance of Gaussian Process (GP) models.
This work addresses the challenge of constructing custom kernel functions for high-dimensional GP regression models.
We introduce a novel approach named KITT: Kernel Identification Through Transformers.
arXiv Detail & Related papers (2021-06-15T14:32:38Z)
- Domain Adaptive Learning Based on Sample-Dependent and Learnable Kernels [2.1485350418225244]
This paper proposes a Domain Adaptive Learning method based on Sample-Dependent and Learnable Kernels (SDLK-DAL).
The first contribution of our work is to propose a sample-dependent and learnable Positive Definite Quadratic Kernel function (PDQK) framework.
We conduct a series of experiments in which the RKHS determined by PDQK replaces the one used in several state-of-the-art DAL algorithms, and our approach achieves better performance.
arXiv Detail & Related papers (2021-02-18T13:55:06Z)
- Isolation Distributional Kernel: A New Tool for Point & Group Anomaly Detection [76.1522587605852]
Isolation Distributional Kernel (IDK) is a new way to measure the similarity between two distributions.
We demonstrate IDK's efficacy and efficiency as a new tool for kernel based anomaly detection for both point and group anomalies.
arXiv Detail & Related papers (2020-09-24T12:25:43Z)
- End-to-end Kernel Learning via Generative Random Fourier Features [31.57596752889935]
Random Fourier features (RFFs) provide a promising approach to kernel learning from a spectral viewpoint.
In this paper, we consider a one-stage process that incorporates the kernel learning and linear learner into a unifying framework.
arXiv Detail & Related papers (2020-09-10T00:27:39Z)
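For context, the classical two-stage RFF pipeline that this paper's one-stage framework unifies can be sketched as follows: first fix the kernel via its spectral distribution (Gaussian frequencies for an RBF kernel), then fit a linear learner, here ridge regression, on the random features. Toy data and all parameter choices below are ours.

```python
import numpy as np

rng = np.random.default_rng(1)

def rff_features(X, n_features, gamma):
    # Random Fourier features approximating the RBF kernel
    # k(x, z) = exp(-gamma * ||x - z||^2): draw w ~ N(0, 2*gamma*I),
    # b ~ U[0, 2*pi), and map x -> sqrt(2/D) * cos(x @ W + b).
    d = X.shape[1]
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), (d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Toy 1-D regression data: noisy sine wave
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

# Stage 1: fix the kernel (via its spectral distribution) and featurize.
Z = rff_features(X, n_features=300, gamma=0.5)

# Stage 2: fit a linear learner (ridge regression) on the random features.
lam = 1e-3
w = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
pred = Z @ w

print(round(float(np.mean((pred - y) ** 2)), 4))
```

The point of the one-stage view is that the spectral distribution in stage 1 is itself learnable rather than fixed before the linear learner is trained.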
This list is automatically generated from the titles and abstracts of the papers in this site.