Radial basis function kernel optimization for Support Vector Machine
classifiers
- URL: http://arxiv.org/abs/2007.08233v1
- Date: Thu, 16 Jul 2020 10:09:15 GMT
- Title: Radial basis function kernel optimization for Support Vector Machine classifiers
- Authors: Karl Thurnhofer-Hemsi, Ezequiel López-Rubio, Miguel A. Molina-Cabello, Kayvan Najarian
- Abstract summary: OKSVM is an algorithm that automatically learns the RBF kernel hyperparameter and adjusts the SVM weights simultaneously.
We analyze the performance of our approach with respect to the classical SVM for classification on synthetic and real data.
- Score: 4.888981184420116
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Support Vector Machines (SVMs) are still one of the most popular and precise
classifiers. The Radial Basis Function (RBF) kernel has been used in SVMs to
separate among classes with considerable success. However, there is an
intrinsic dependence on the initial value of the kernel hyperparameter. In this
work, we propose OKSVM, an algorithm that automatically learns the RBF kernel
hyperparameter and adjusts the SVM weights simultaneously. The proposed
optimization technique is based on a gradient descent method. We analyze the
performance of our approach with respect to the classical SVM for
classification on synthetic and real data. Experimental results show that OKSVM
performs better irrespective of the initial values of the RBF hyperparameter.
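The abstract describes gradient descent that updates the SVM weights and the RBF hyperparameter together. A toy NumPy sketch of that idea follows; it is not the authors' OKSVM implementation, and the synthetic data, the unregularized hinge loss, and the learning rates are illustrative assumptions.

```python
import numpy as np

# Toy sketch: jointly update the kernel-expansion weights alpha and the RBF
# hyperparameter gamma by gradient descent on a hinge loss. NOT the authors'
# OKSVM code; data, loss, and learning rates are assumptions for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where((X ** 2).sum(axis=1) > 1.0, 1.0, -1.0)  # ring-shaped classes

def rbf_kernel(X, gamma):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    return np.exp(-gamma * sq), sq

alpha = np.zeros(len(X))        # decision function f(x_i) = sum_j K_ij * alpha_j
gamma = 0.1                     # deliberately poor initial hyperparameter value
lr_alpha, lr_gamma = 0.05, 0.01
for _ in range(300):
    K, sq = rbf_kernel(X, gamma)
    active = y * (K @ alpha) < 1.0            # margin violators drive the hinge loss
    if not active.any():
        break
    g_alpha = -(K[active].T @ y[active]) / len(X)
    dK_dgamma = -sq * K                       # d/dgamma of exp(-gamma * sq)
    g_gamma = -(y[active] * (dK_dgamma[active] @ alpha)).sum() / len(X)
    alpha -= lr_alpha * g_alpha
    gamma = max(gamma - lr_gamma * g_gamma, 1e-3)  # keep gamma positive

K, _ = rbf_kernel(X, gamma)
acc = (np.sign(K @ alpha) == y).mean()
print(f"learned gamma={gamma:.3f}, train accuracy={acc:.2f}")
```

Because gamma itself receives gradient updates, a poor starting value can drift toward a better one, which is the behavior the experiments attribute to OKSVM.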
Related papers
- An Autotuning-based Optimization Framework for Mixed-kernel SVM Classifications in Smart Pixel Datasets and Heterojunction Transistors [0.0]
Support Vector Machine (SVM) is a state-of-the-art classification method widely used in science and engineering.
We propose an autotuning-based optimization framework to quantify the ranges of hyperparameters in SVMs and identify their optimal choices.
arXiv Detail & Related papers (2024-06-26T15:50:13Z)
- MOKD: Cross-domain Finetuning for Few-shot Classification via Maximizing Optimized Kernel Dependence [97.93517982908007]
In cross-domain few-shot classification, NCC aims to learn representations to construct a metric space where few-shot classification can be performed.
In this paper, we find that there exist high similarities between NCC-learned representations of two samples from different classes.
We propose a bi-level optimization framework, maximizing optimized kernel dependence (MOKD), to learn a set of class-specific representations that match the cluster structures indicated by labeled data.
arXiv Detail & Related papers (2024-05-29T05:59:52Z)
- Separability and Scatteredness (S&S) Ratio-Based Efficient SVM Regularization Parameter, Kernel, and Kernel Parameter Selection [10.66048003460524]
Support Vector Machine (SVM) is a robust machine learning algorithm with broad applications in classification, regression, and outlier detection.
This work shows that the SVM performance can be modeled as a function of separability and scatteredness (S&S) of the data.
arXiv Detail & Related papers (2023-05-17T13:51:43Z)
- A new trigonometric kernel function for SVM [0.0]
We introduce a new trigonometric kernel function containing one parameter for the machine learning algorithms.
We also conduct an empirical evaluation on the kernel-SVM and kernel-SVR methods and demonstrate its strong performance.
arXiv Detail & Related papers (2022-10-16T17:10:52Z)
- Handling Imbalanced Classification Problems With Support Vector Machines via Evolutionary Bilevel Optimization [73.17488635491262]
Support vector machines (SVMs) are popular learning algorithms to deal with binary classification problems.
This article introduces EBCS-SVM: evolutionary bilevel cost-sensitive SVMs.
arXiv Detail & Related papers (2022-04-21T16:08:44Z)
- Geometry-aware Bayesian Optimization in Robotics using Riemannian Matérn Kernels [64.62221198500467]
We show how to implement geometry-aware kernels for Bayesian optimization.
This technique can be used for control parameter tuning, parametric policy adaptation, and structure design in robotics.
arXiv Detail & Related papers (2021-11-02T09:47:22Z)
- Estimating Average Treatment Effects with Support Vector Machines [77.34726150561087]
Support vector machine (SVM) is one of the most popular classification algorithms in the machine learning literature.
We adapt SVM as a kernel-based weighting procedure that minimizes the maximum mean discrepancy between the treatment and control groups.
We characterize the bias of causal effect estimation arising from this trade-off, connecting the proposed SVM procedure to the existing kernel balancing methods.
arXiv Detail & Related papers (2021-02-23T20:22:56Z)
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel sample-efficient gradient estimator named stocBiO.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
- A Semismooth-Newton's-Method-Based Linearization and Approximation Approach for Kernel Support Vector Machines [1.177306187948666]
Support Vector Machines (SVMs) are among the most popular and the best performing classification algorithms.
In this paper, we propose a semismooth Newton's method based linearization approximation approach for kernel SVMs.
The advantage of the proposed approach is that it maintains low computational cost and keeps a fast convergence rate.
arXiv Detail & Related papers (2020-07-21T07:44:21Z)
- On Coresets for Support Vector Machines [61.928187390362176]
A coreset is a small, representative subset of the original data points.
We show that our algorithm can be used to extend the applicability of any off-the-shelf SVM solver to streaming, distributed, and dynamic data settings.
arXiv Detail & Related papers (2020-02-15T23:25:12Z)
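Several entries above (the autotuning framework, the S&S ratio method) target RBF hyperparameter selection; the common baseline they improve on is a plain validation-set grid search. A minimal NumPy sketch of that baseline follows, using a kernel ridge classifier as a hypothetical stand-in for an SVM; the data, grid, and regularization constant are made-up assumptions.

```python
import numpy as np

# Baseline the hyperparameter-selection papers above improve on: a plain
# validation-set grid search over the RBF gamma. A kernel ridge classifier
# stands in for an SVM to keep the sketch short; data and grid are invented.
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 2))
y = np.where((X ** 2).sum(axis=1) > 1.0, 1.0, -1.0)  # ring-shaped classes
Xtr, ytr, Xva, yva = X[:80], y[:80], X[80:], y[80:]

def rbf(A, B, gamma):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def val_accuracy(gamma, reg=1e-3):
    K = rbf(Xtr, Xtr, gamma)
    alpha = np.linalg.solve(K + reg * np.eye(len(Xtr)), ytr)  # ridge-regularized fit
    preds = np.sign(rbf(Xva, Xtr, gamma) @ alpha)
    return (preds == yva).mean()

grid = [0.01, 0.1, 1.0, 10.0]
scores = {g: val_accuracy(g) for g in grid}
best_gamma = max(scores, key=scores.get)
print(f"best gamma={best_gamma}, validation accuracy={scores[best_gamma]:.2f}")
```

Each candidate gamma costs a full fit-and-evaluate pass, which is why the listed papers pursue cheaper alternatives such as autotuning over quantified ranges or learning the hyperparameter directly.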
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.