Convolutional Support Vector Machine
- URL: http://arxiv.org/abs/2002.07221v1
- Date: Tue, 11 Feb 2020 11:23:21 GMT
- Title: Convolutional Support Vector Machine
- Authors: Wei-Chang Yeh
- Abstract summary: This paper proposes a novel convolutional SVM (CSVM) that has the advantages of both CNN and SVM to improve the accuracy and effectiveness of mining smaller datasets.
To evaluate the performance of the proposed CSVM, experiments were conducted on five well-known benchmark databases for the classification problem.
- Score: 1.5990720051907859
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The support vector machine (SVM) and deep learning (e.g., convolutional neural networks (CNNs)) are the two best-known algorithms for small and big data, respectively. Nonetheless, smaller datasets may be very important, costly, and difficult to obtain in a short time. This paper proposes a novel convolutional SVM (CSVM) that combines the advantages of CNN and SVM to improve the accuracy and effectiveness of mining smaller datasets. The proposed CSVM adapts the convolution product from CNN to learn new information hidden deeply in the datasets. In addition, it uses a modified simplified swarm optimization (SSO) algorithm to train the CSVM by updating the classifiers, with the traditional SVM serving as the fitness function for the SSO to estimate the accuracy. To evaluate the performance of the proposed CSVM, experiments were conducted on five well-known benchmark databases for the classification problem. The numerical results compared favorably with those obtained using SVM, a 3-layer artificial neural network (ANN), and a 4-layer ANN. These experiments verify that the proposed CSVM with the proposed SSO can effectively increase classification accuracy.
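The abstract describes a three-part loop: convolution filters transform the inputs, a simplified swarm optimization searches for good filter weights, and a conventional SVM's accuracy serves as the SSO fitness. The Python sketch below is only an illustration of that idea, not the authors' implementation; the filter length, swarm size, the cg/cp thresholds, and the reduced SSO update rule are all assumptions made for brevity.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def convolve_features(X, filt):
    """Slide a 1-D convolution filter over each sample's feature vector."""
    return np.array([np.convolve(x, filt, mode="valid") for x in X])

def fitness(X, y, filt):
    """Fitness = mean cross-validated accuracy of an SVM on the convolved features."""
    return cross_val_score(SVC(kernel="rbf"), convolve_features(X, filt), y, cv=3).mean()

def sso_train_filter(X, y, filt_len=3, pop=10, iters=20, cg=0.4, cp=0.7):
    """Simplified-swarm-style search over filter weights (illustrative update rule)."""
    pbest = rng.uniform(-1, 1, size=(pop, filt_len))        # personal bests
    pbest_fit = np.array([fitness(X, y, f) for f in pbest])
    g = np.argmax(pbest_fit)
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]        # global best
    for _ in range(iters):
        for i in range(pop):
            r = rng.random(filt_len)
            # Each filter weight is copied from the global best, the personal
            # best, or redrawn at random, depending on the thresholds cg < cp.
            cand = np.where(r < cg, gbest,
                   np.where(r < cp, pbest[i], rng.uniform(-1, 1, filt_len)))
            f = fitness(X, y, cand)
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = cand, f
                if f > gbest_fit:
                    gbest, gbest_fit = cand.copy(), f
    return gbest, gbest_fit

# Toy usage on synthetic data; the paper's experiments use benchmark datasets.
X = rng.normal(size=(60, 10))
y = (X[:, 0] + X[:, 3] > 0).astype(int)
best_filter, best_acc = sso_train_filter(X, y)
print("best filter:", best_filter, "CV accuracy:", best_acc)
```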
Related papers
- Research on gesture recognition method based on SEDCNN-SVM [23.334616185686]
SEDCNN-SVM is proposed to recognize different gestures from sEMG signals.
SEDCNN-SVM consists of an improved deep convolutional neural network (DCNN) and a support vector machine (SVM).
arXiv Detail & Related papers (2024-10-24T09:02:56Z)
- Large-Margin Representation Learning for Texture Classification [67.94823375350433]
This paper presents a novel approach combining convolutional layers (CLs) and large-margin metric learning for training supervised models on small datasets for texture classification.
The experimental results on texture and histopathologic image datasets have shown that the proposed approach achieves competitive accuracy with lower computational cost and faster convergence when compared to equivalent CNNs.
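As a rough illustration of the large-margin metric-learning ingredient mentioned above (not the paper's architecture or loss), the sketch below computes a triplet hinge loss that requires an anchor to be at least `margin` closer to a same-class sample than to a different-class one; in practice the embeddings would come from the convolutional layers.

```python
import numpy as np

def triplet_hinge_loss(anchor, positive, negative, margin=1.0):
    """Hinge loss enforcing d(anchor, positive) + margin <= d(anchor, negative)."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, margin + d_pos - d_neg)

rng = np.random.default_rng(2)
emb = rng.normal(size=(3, 16))      # stand-in for CNN feature embeddings
print(triplet_hinge_loss(emb[0], emb[1], emb[2]))
```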
arXiv Detail & Related papers (2022-06-17T04:07:45Z)
- Handling Imbalanced Classification Problems With Support Vector Machines via Evolutionary Bilevel Optimization [73.17488635491262]
Support vector machines (SVMs) are popular learning algorithms to deal with binary classification problems.
This article introduces EBCS-SVM: evolutionary bilevel cost-sensitive SVMs.
arXiv Detail & Related papers (2022-04-21T16:08:44Z)
- Efficient Cluster-Based k-Nearest-Neighbor Machine Translation [65.69742565855395]
k-Nearest-Neighbor Machine Translation (kNN-MT) has recently been proposed as a non-parametric solution for domain adaptation in neural machine translation (NMT).
arXiv Detail & Related papers (2022-04-13T05:46:31Z)
- Learning from Small Samples: Transformation-Invariant SVMs with Composition and Locality at Multiple Scales [11.210266084524998]
This paper shows how to incorporate into support-vector machines (SVMs) those properties that have made convolutional neural networks (CNNs) successful.
arXiv Detail & Related papers (2021-09-27T04:02:43Z)
- Rank-R FNN: A Tensor-Based Learning Model for High-Order Data Classification [69.26747803963907]
Rank-R Feedforward Neural Network (FNN) is a tensor-based nonlinear learning model that imposes Canonical/Polyadic decomposition on its parameters.
First, it handles inputs as multilinear arrays, bypassing the need for vectorization, and can thus fully exploit the structural information along every data dimension.
We establish the universal approximation and learnability properties of Rank-R FNN, and we validate its performance on real-world hyperspectral datasets.
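The summary above says the model treats inputs as multilinear arrays and constrains its weights via a Canonical/Polyadic (CP) decomposition. A minimal sketch of that idea for an order-2 input follows; the rank, sigmoid activation, and single-unit setup are illustrative assumptions rather than the paper's architecture.

```python
import numpy as np

def rank_r_unit(X, A, B, bias=0.0):
    """One hidden unit whose weight matrix is constrained to rank R:
    W = sum_r outer(A[r], B[r]), so <X, W> = sum_r A[r] @ X @ B[r]."""
    z = sum(a @ X @ b for a, b in zip(A, B)) + bias
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid activation

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 5))                               # order-2 input, no vectorization
A, B = rng.normal(size=(3, 8)), rng.normal(size=(3, 5))   # R = 3 factor vectors per mode
print(rank_r_unit(X, A, B))
```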
arXiv Detail & Related papers (2021-04-11T16:37:32Z)
- Unsupervised Real Time Prediction of Faults Using the Support Vector Machine [1.1852751647387592]
We show that the proposed solution performs much better when the SMO training algorithm is used.
The classification performance of this predictive model is considerably better than that of the SVM with and without the SMO training algorithm.
arXiv Detail & Related papers (2020-12-30T04:27:10Z)
- Solving Mixed Integer Programs Using Neural Networks [57.683491412480635]
This paper applies learning to the two key sub-tasks of a MIP solver: generating a high-quality joint variable assignment, and bounding the gap in objective value between that assignment and an optimal one.
Our approach constructs two corresponding neural network-based components, Neural Diving and Neural Branching, to use in a base MIP solver such as SCIP.
We evaluate our approach on six diverse real-world datasets, including two Google production datasets and MIPLIB, by training separate neural networks on each.
arXiv Detail & Related papers (2020-12-23T09:33:11Z)
- AML-SVM: Adaptive Multilevel Learning with Support Vector Machines [0.0]
This paper proposes an adaptive multilevel learning framework for the nonlinear SVM.
It improves the classification quality across the refinement process, and leverages multi-threaded parallel processing for better performance.
arXiv Detail & Related papers (2020-11-05T00:17:02Z)
- On Coresets for Support Vector Machines [61.928187390362176]
A coreset is a small, representative subset of the original data points.
We show that our algorithm can be used to extend the applicability of any off-the-shelf SVM solver to streaming, distributed, and dynamic data settings.
arXiv Detail & Related papers (2020-02-15T23:25:12Z)
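To make the coreset idea above concrete: train the SVM on a small weighted subset instead of all points. The sketch below uses plain uniform sampling with equal weights as a stand-in; the paper constructs the subset and weights by importance sampling with provable guarantees, which is not reproduced here.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(3)
X = rng.normal(size=(5000, 20))
y = (X @ rng.normal(size=20) > 0).astype(int)

# Pick a small subset and per-point weights; a real coreset would choose both
# so that the weighted subset provably approximates the full SVM objective.
idx = rng.choice(len(X), size=200, replace=False)
weights = np.full(len(idx), len(X) / len(idx))   # rescale to full-data magnitude

svm = LinearSVC().fit(X[idx], y[idx], sample_weight=weights)
print("accuracy on all points:", svm.score(X, y))
```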
This list is automatically generated from the titles and abstracts of the papers in this site.