MKLpy: a python-based framework for Multiple Kernel Learning
- URL: http://arxiv.org/abs/2007.09982v1
- Date: Mon, 20 Jul 2020 10:10:13 GMT
- Title: MKLpy: a python-based framework for Multiple Kernel Learning
- Authors: Ivano Lauriola and Fabio Aiolli
- Abstract summary: We introduce MKLpy, a python-based framework for Multiple Kernel Learning.
The library provides Multiple Kernel Learning algorithms for classification tasks, mechanisms to compute kernel functions for different data types, and evaluation strategies.
- Score: 4.670305538969914
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Multiple Kernel Learning is a recent and powerful paradigm to learn the
kernel function from data. In this paper, we introduce MKLpy, a python-based
framework for Multiple Kernel Learning. The library provides Multiple Kernel
Learning algorithms for classification tasks, mechanisms to compute kernel
functions for different data types, and evaluation strategies. The library is
designed to maximize usability and to simplify the development of novel
solutions.
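At its core, MKL learns (or fixes) a combination of base kernels rather than a single one. The following is a minimal NumPy sketch of a fixed convex combination of two base kernels — it is not MKLpy's actual API, and all function names and weights here are illustrative:

```python
import numpy as np

def linear_kernel(X, Z):
    return X @ Z.T

def rbf_kernel(X, Z, gamma=1.0):
    # ||x - z||^2 expanded as ||x||^2 + ||z||^2 - 2 x.z
    sq = (X**2).sum(1)[:, None] + (Z**2).sum(1)[None, :] - 2 * X @ Z.T
    return np.exp(-gamma * sq)

def combined_kernel(X, Z, weights):
    # convex combination of base kernels: K = sum_r w_r K_r, w_r >= 0
    base = [linear_kernel(X, Z), rbf_kernel(X, Z)]
    return sum(w * K for w, K in zip(weights, base))

X = np.random.RandomState(0).randn(5, 3)
K = combined_kernel(X, X, weights=[0.3, 0.7])  # Gram matrix of the combined kernel
```

An MKL algorithm's job is to learn the `weights` from data instead of fixing them by hand; a convex combination of positive semidefinite kernels is itself a valid kernel.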
Related papers
- Snacks: a fast large-scale kernel SVM solver [0.8602553195689513]
Snacks is a new large-scale solver for Kernel Support Vector Machines.
Snacks relies on a Nyström approximation of the kernel matrix and an accelerated variant of the subgradient method.
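The Nyström method approximates the full Gram matrix from a subset of landmark columns, K ≈ C W⁺ Cᵀ. A minimal NumPy sketch of the idea — the data, landmark count, and kernel here are illustrative, not the Snacks implementation:

```python
import numpy as np

def rbf(X, Z, gamma=0.1):
    sq = (X**2).sum(1)[:, None] + (Z**2).sum(1)[None, :] - 2 * X @ Z.T
    return np.exp(-gamma * sq)

rng = np.random.RandomState(0)
X = rng.randn(200, 4)
m = 40                                     # number of landmark points
idx = rng.choice(len(X), m, replace=False)

C = rbf(X, X[idx])                         # n x m cross-kernel
W = rbf(X[idx], X[idx])                    # m x m landmark kernel
K_approx = C @ np.linalg.pinv(W) @ C.T     # rank-m approximation K ~ C W^+ C^T

K_exact = rbf(X, X)
err = np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact)
```

The approximation is exact on the landmark block and costs O(nm² + m³) instead of the O(n²) needed to materialize the full kernel matrix.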
arXiv Detail & Related papers (2023-04-17T04:19:20Z)
- A new trigonometric kernel function for SVM [0.0]
We introduce a new trigonometric kernel function containing one parameter for the machine learning algorithms.
We also conduct an empirical evaluation on the kernel-SVM and kernel-SVR methods and demonstrate its strong performance.
arXiv Detail & Related papers (2022-10-16T17:10:52Z)
- Fast Sketching of Polynomial Kernels of Polynomial Degree [61.83993156683605]
The polynomial kernel is especially important, as other kernels can often be approximated by it via a Taylor series expansion.
Recent techniques in oblivious sketching reduce the dependence of the running time on the degree $q$ of the kernel.
We give a new sketch which greatly improves upon this running time, by removing the dependence on $q$ in the leading order term.
arXiv Detail & Related papers (2021-08-21T02:14:55Z)
- Kernel Continual Learning [117.79080100313722]
Kernel continual learning is a simple but effective variant of continual learning to tackle catastrophic forgetting.
An episodic memory unit stores a subset of samples for each task, from which task-specific classifiers are learned with kernel ridge regression.
Variational random features are used to learn a data-driven kernel for each task.
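Kernel ridge regression has a closed-form fit, alpha = (K + λI)⁻¹ y. A minimal NumPy sketch of that step — the toy data, labels, and hyperparameters are illustrative, not taken from the paper:

```python
import numpy as np

def rbf(X, Z, gamma=1.0):
    sq = (X**2).sum(1)[:, None] + (Z**2).sum(1)[None, :] - 2 * X @ Z.T
    return np.exp(-gamma * sq)

rng = np.random.RandomState(0)
X = rng.randn(50, 2)                       # stand-in for one task's stored samples
y = np.sign(X[:, 0])                       # toy binary labels in {-1, +1}

lam = 1e-2                                 # ridge regularizer
K = rbf(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)   # closed-form KRR fit

def predict(X_new):
    return rbf(X_new, X) @ alpha           # decision values; the sign gives the class

train_acc = (np.sign(predict(X)) == y).mean()
```

Because the fit is a single linear solve per task, keeping one such classifier per episodic-memory subset is cheap, which is what makes KRR attractive in this continual-learning setting.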
arXiv Detail & Related papers (2021-07-12T22:09:30Z)
- MetaKernel: Learning Variational Random Features with Limited Labels [120.90737681252594]
Few-shot learning deals with the fundamental and challenging problem of learning from a few annotated samples, while being able to generalize well on new tasks.
We propose meta-learning kernels with random Fourier features for few-shot learning, which we call MetaKernel.
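Random Fourier features approximate a shift-invariant kernel with an explicit, finite-dimensional feature map whose inner products converge to the kernel. A minimal NumPy sketch for the RBF kernel — dimensions and parameters are illustrative, and this is the classic construction, not the paper's variational version:

```python
import numpy as np

rng = np.random.RandomState(0)
gamma, d, D = 0.5, 3, 2000                 # kernel width, input dim, feature count

# For k(x, z) = exp(-gamma ||x - z||^2), spectral sampling gives w ~ N(0, 2*gamma*I)
W = np.sqrt(2 * gamma) * rng.randn(d, D)
b = rng.uniform(0, 2 * np.pi, D)

def features(X):
    # explicit map whose inner products approximate the RBF kernel
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = rng.randn(30, d)
K_exact = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
K_rff = features(X) @ features(X).T
max_err = np.abs(K_exact - K_rff).max()
```

The approximation error shrinks at roughly 1/sqrt(D); the paper learns a distribution over the frequencies `W` per task instead of sampling them from a fixed Gaussian.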
arXiv Detail & Related papers (2021-05-08T21:24:09Z)
- Neural Generalization of Multiple Kernel Learning [2.064612766965483]
Multiple Kernel Learning is a conventional way to learn the kernel function in kernel-based methods.
Deep learning models can learn complex functions by applying nonlinear transformations to data through several layers.
We show that a typical MKL algorithm can be interpreted as a one-layer neural network with linear activation functions.
arXiv Detail & Related papers (2021-02-26T07:28:37Z)
- Deep Learning Framework From Scratch Using Numpy [0.0]
This work is a rigorous development of a complete and general-purpose deep learning framework from the ground up.
The fundamental components of deep learning are developed from elementary calculus and implemented in a sensible object-oriented approach using only Python and the Numpy library.
Demonstrations of solved problems using the framework, named ArrayFlow, include a computer vision classification task, solving for the shape of a catenary, and solving a second-order differential equation.
arXiv Detail & Related papers (2020-11-17T06:28:05Z)
- Learning Manifold Implicitly via Explicit Heat-Kernel Learning [63.354671267760516]
We propose the concept of implicit manifold learning, where manifold information is implicitly obtained by learning the associated heat kernel.
The learned heat kernel can be applied to various kernel-based machine learning models, including deep generative models (DGM) for data generation and Stein Variational Gradient Descent for Bayesian inference.
arXiv Detail & Related papers (2020-10-05T03:39:58Z)
- Captum: A unified and generic model interpretability library for PyTorch [49.72749684393332]
We introduce a novel, unified, open-source model interpretability library for PyTorch.
The library contains generic implementations of a number of gradient and perturbation-based attribution algorithms.
It can be used for both classification and non-classification models.
arXiv Detail & Related papers (2020-09-16T18:57:57Z)
- Kernel methods library for pattern analysis and machine learning in python [0.0]
The kernelmethods library fills an important void in the Python ML ecosystem in a domain-agnostic fashion.
The library provides a number of well-defined classes to make various kernel-based operations efficient.
arXiv Detail & Related papers (2020-05-27T16:44:42Z)
- PolyScientist: Automatic Loop Transformations Combined with Microkernels for Optimization of Deep Learning Primitives [55.79741270235602]
We develop a hybrid solution to the development of deep learning kernels.
We use the advanced polyhedral technology to automatically tune the outer loops for performance.
arXiv Detail & Related papers (2020-02-06T08:02:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.