Graph-Embedded Subspace Support Vector Data Description
- URL: http://arxiv.org/abs/2104.14370v1
- Date: Thu, 29 Apr 2021 14:30:48 GMT
- Title: Graph-Embedded Subspace Support Vector Data Description
- Authors: Fahad Sohrab, Alexandros Iosifidis, Moncef Gabbouj, Jenni Raitoharju
- Abstract summary: We propose a novel subspace learning framework for one-class classification.
The proposed framework presents the problem in the form of graph embedding.
We demonstrate improved performance against the baselines and the recently proposed subspace learning methods for one-class classification.
- Score: 98.78559179013295
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose a novel subspace learning framework for one-class
classification. The proposed framework presents the problem in the form of
graph embedding. It includes the previously proposed subspace one-class
techniques as its special cases and provides further insight on what these
techniques actually optimize. The framework allows one to incorporate other
meaningful optimization goals via the graph preserving criterion and reveals
spectral and spectral regression-based solutions as alternatives to the
previously used gradient-based technique. We combine the subspace learning
framework iteratively with Support Vector Data Description applied in the
subspace to formulate Graph-Embedded Subspace Support Vector Data Description.
We experimentally analyze the performance of the different newly proposed
variants. We demonstrate improved performance against the baselines and the
recently proposed subspace learning methods for one-class classification.
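The abstract describes alternating between learning a subspace projection and applying a data description inside it. A minimal sketch of that alternation, assuming a hard-margin, mean-centered description and illustrative variable names (the paper's actual S-SVDD objective and graph-embedding terms are more involved):

```python
import numpy as np

# Hedged sketch: learn a projection Q, describe the data in the subspace,
# update Q to tighten the description, and re-orthogonalize each step.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # toy one-class training data
d = 2                                  # target subspace dimension

Q = rng.normal(size=(d, X.shape[1]))   # projection matrix to be learned
lr = 0.01

for _ in range(100):
    Z = X @ Q.T                        # project data into the subspace
    a = Z.mean(axis=0)                 # hypersphere center (hard-margin case)
    G = 2 * (Z - a).T @ X / len(X)     # gradient of mean squared distance w.r.t. Q
    Q -= lr * G                        # shrink the enclosing hypersphere
    Q = np.linalg.qr(Q.T)[0].T         # keep the projection orthonormal (Q Q^T = I)

Z = X @ Q.T
a = Z.mean(axis=0)
R = np.sqrt(((Z - a) ** 2).sum(axis=1).max())  # description radius in the subspace
```

The orthonormalization step stands in for the orthogonality constraint used in subspace one-class methods; without it, gradient descent would collapse Q toward zero.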
Related papers
- Newton Method-based Subspace Support Vector Data Description [16.772385337198834]
We present an adaptation of Newton's method for the optimization of Subspace Support Vector Data Description (S-SVDD)
We leverage Newton's method to enhance data mapping and data description for an improved optimization of subspace learning-based one-class classification.
The paper discusses the limitations of gradient descent and the advantages of using Newton's method in subspace learning for one-class classification tasks.
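The summary above contrasts gradient descent with Newton's method for subspace optimization. A generic damped Newton update on a toy L2-regularized logistic loss, purely to illustrate replacing a gradient step with a Hessian-informed step (this is not the paper's S-SVDD objective):

```python
import numpy as np

# Toy strongly convex loss with closed-form gradient and Hessian.
X = np.array([[1.0, 2.0], [2.0, 0.5], [0.5, 1.5]])
y = np.array([1.0, -1.0, 1.0])
lam = 0.1                                   # ridge term keeps the Hessian well-conditioned

def grad_hess(w):
    z = y * (X @ w)
    s = 1.0 / (1.0 + np.exp(z))             # sigmoid(-z)
    g = -(X * (y * s)[:, None]).sum(axis=0) + lam * w
    r = s * (1.0 - s)
    H = X.T @ (X * r[:, None]) + lam * np.eye(2)
    return g, H

w = np.zeros(2)
for _ in range(20):
    g, H = grad_hess(w)
    w -= np.linalg.solve(H, g)              # Newton step: H^{-1} g
```

Because the Newton step rescales the gradient by local curvature, it typically reaches a stationary point in far fewer iterations than a fixed-step gradient descent on the same loss.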
arXiv Detail & Related papers (2023-09-25T08:49:41Z) - Semi-Supervised Laplace Learning on Stiefel Manifolds [48.3427853588646]
We develop a Sequential Subspace framework for graph-based semi-supervised learning at low label rates.
We show that our methods perform well at both extremely low and high label rates.
arXiv Detail & Related papers (2023-07-31T20:19:36Z) - Symmetric Spaces for Graph Embeddings: A Finsler-Riemannian Approach [7.752212921476838]
We propose the systematic use of symmetric spaces in representation learning, a class encompassing many of the previously used embedding targets.
We develop a tool to analyze the embeddings and infer structural properties of the data sets.
Our approach outperforms competitive baselines for graph reconstruction tasks on various synthetic and real-world datasets.
arXiv Detail & Related papers (2021-06-09T09:33:33Z) - Auto-weighted Multi-view Feature Selection with Graph Optimization [90.26124046530319]
We propose a novel unsupervised multi-view feature selection model based on graph learning.
The contributions are threefold: (1) during the feature selection procedure, the consensus similarity graph shared by different views is learned.
Experiments on various datasets demonstrate the superiority of the proposed method compared with the state-of-the-art methods.
arXiv Detail & Related papers (2021-04-11T03:25:25Z) - Panoster: End-to-end Panoptic Segmentation of LiDAR Point Clouds [81.12016263972298]
We present Panoster, a novel proposal-free panoptic segmentation method for LiDAR point clouds.
Unlike previous approaches, Panoster proposes a simplified framework incorporating a learning-based clustering solution to identify instances.
At inference time, this acts as a class-agnostic segmentation, allowing Panoster to be fast, while outperforming prior methods in terms of accuracy.
arXiv Detail & Related papers (2020-10-28T18:10:20Z) - Rank-one partitioning: formalization, illustrative examples, and a new cluster enhancing strategy [17.166794984161967]
We introduce and formalize a rank-one partitioning learning paradigm that unifies partitioning methods.
We propose a novel algorithmic solution for the partitioning problem based on rank-one matrix factorization and denoising of piecewise constant signals.
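The summary above mentions partitioning via rank-one matrix factorization. A hedged sketch of that idea, assuming the simplest variant: approximate the data matrix by its leading singular pair and read a two-way partition off the sign pattern of the left singular vector (the toy data and thresholding rule are illustrative, not the paper's algorithm):

```python
import numpy as np

# Two blocks of samples with opposite mean profiles.
rng = np.random.default_rng(1)
A = np.vstack([rng.normal(1.0, 0.1, size=(10, 4)),    # block 1
               rng.normal(-1.0, 0.1, size=(10, 4))])  # block 2

# Best rank-one approximation A ~ s * u v^T via the SVD.
U, S, Vt = np.linalg.svd(A, full_matrices=False)
u = U[:, 0]                       # leading left singular vector
labels = (u > 0).astype(int)      # two-way partition from its sign pattern
```

When the data matrix is close to rank one per block, the leading left singular vector is approximately piecewise constant over the blocks, which is why thresholding it recovers the partition.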
arXiv Detail & Related papers (2020-09-01T11:37:28Z) - There and Back Again: Revisiting Backpropagation Saliency Methods [87.40330595283969]
Saliency methods seek to explain the predictions of a model by producing an importance map across each input sample.
A popular class of such methods is based on backpropagating a signal and analyzing the resulting gradient.
We propose a single framework under which several such methods can be unified.
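The class of methods described here backpropagates a signal and analyzes the resulting gradient. A minimal sketch of one such method, gradient-times-input, on a tiny handwritten ReLU network (the network and weights are illustrative):

```python
import numpy as np

# Push a class score's gradient back to the input and take
# |gradient * input| as the importance map.
rng = np.random.default_rng(2)
W1 = rng.normal(size=(3, 4))       # input -> hidden weights
w2 = rng.normal(size=3)            # hidden -> score weights

x = rng.normal(size=4)             # one input sample
h = np.maximum(W1 @ x, 0.0)        # ReLU hidden activations
score = w2 @ h                     # scalar class score

dh = w2 * (h > 0)                  # backprop through the ReLU
dx = W1.T @ dh                     # gradient of the score w.r.t. the input
saliency = np.abs(dx * x)          # gradient-times-input importance map
```

Other members of the family differ mainly in how the backward signal is modified (e.g. rectifying it at each layer), which is what a unifying framework can parameterize.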
arXiv Detail & Related papers (2020-04-06T17:58:08Z) - Ellipsoidal Subspace Support Vector Data Description [98.67884574313292]
We propose a novel method for transforming data into a low-dimensional space optimized for one-class classification.
We provide both linear and non-linear formulations for the proposed method.
The proposed method is observed to converge much faster than the recently proposed Subspace Support Vector Data Description.
arXiv Detail & Related papers (2020-03-20T21:31:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.