Manifold Partition Discriminant Analysis
- URL: http://arxiv.org/abs/2011.11521v1
- Date: Mon, 23 Nov 2020 16:33:23 GMT
- Title: Manifold Partition Discriminant Analysis
- Authors: Yang Zhou and Shiliang Sun
- Abstract summary: We propose a novel algorithm for supervised dimensionality reduction named Manifold Partition Discriminant Analysis (MPDA).
It aims to find a linear embedding space where the within-class similarity is achieved along the direction that is consistent with the local variation of the data manifold.
MPDA explicitly parameterizes the connections of tangent spaces and represents the data manifold in a piecewise manner.
- Score: 42.11470531267327
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a novel algorithm for supervised dimensionality reduction named
Manifold Partition Discriminant Analysis (MPDA). It aims to find a linear
embedding space where the within-class similarity is achieved along the
direction that is consistent with the local variation of the data manifold,
while nearby data belonging to different classes are well separated. By
partitioning the data manifold into a number of linear subspaces and utilizing
the first-order Taylor expansion, MPDA explicitly parameterizes the connections
of tangent spaces and represents the data manifold in a piecewise manner. While
graph Laplacian methods capture only pairwise interactions between data
points, our method captures both pairwise and higher-order interactions (using
regional consistency) between data points. This manifold representation can
help to improve the measure of within-class similarity, which further leads to
improved performance of dimensionality reduction. Experimental results on
multiple real-world data sets demonstrate the effectiveness of the proposed
method.
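As a rough formalization of the tangent-space machinery, in the spirit of first-order Taylor-expansion methods (the symbols $T_i$, $\theta_i$, and $\mathcal{N}(i)$ are illustrative; the paper's exact objective may differ):

```latex
% First-order Taylor expansion of a smooth embedding function f at x_i, with
% the gradient constrained to the tangent space spanned by T_i (an orthonormal
% basis, e.g. from local PCA on the linear piece containing x_i):
f(x_j) \approx f(x_i) + \nabla f(x_i)^{\top}(x_j - x_i),
\qquad \nabla f(x_i) = T_i \theta_i .

% A within-class smoothness objective then penalizes deviations of f from its
% local linearization over each neighborhood N(i):
\min_{f,\,\{\theta_i\}} \; \sum_i \sum_{j \in \mathcal{N}(i)}
  \bigl( f(x_j) - f(x_i) - \theta_i^{\top} T_i^{\top}(x_j - x_i) \bigr)^{2} .
```

Because every residual in the inner sum shares the tangent coordinates $\theta_i$, each neighborhood enters the objective as a region rather than as a bag of independent edges, which is one way to read the pairwise-versus-higher-order contrast drawn above.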
Related papers
- Distributional Reduction: Unifying Dimensionality Reduction and Clustering with Gromov-Wasserstein [56.62376364594194]
Unsupervised learning aims to capture the underlying structure of potentially large and high-dimensional datasets.
In this work, we revisit these approaches under the lens of optimal transport and exhibit relationships with the Gromov-Wasserstein problem.
This unveils a new general framework, called distributional reduction, that recovers DR and clustering as special cases and allows addressing them jointly within a single optimization problem.
arXiv Detail & Related papers (2024-02-03T19:00:19Z)
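For context, the Gromov-Wasserstein problem invoked by the distributional-reduction entry above compares two metric-measure spaces through a coupling of their samples; a standard square-loss form (notation mine, not taken from the paper) is:

```latex
% GW between similarity matrices C (n x n) and C' (m x m) with sample
% weights p and q; the coupling T is a soft matching between the two spaces.
GW(C, C') = \min_{T \in \Pi(p,q)}
  \sum_{i,k=1}^{n} \sum_{j,l=1}^{m} \bigl(C_{ik} - C'_{jl}\bigr)^{2}\, T_{ij}\, T_{kl},
\qquad
\Pi(p,q) = \bigl\{\, T \ge 0 : T\mathbf{1} = p,\; T^{\top}\mathbf{1} = q \,\bigr\}.
```

Roughly, taking $C'$ to be the similarity matrix of a small set of learnable low-dimensional prototypes lets $T$ act simultaneously as a soft cluster assignment and an embedding device, which is the joint treatment of DR and clustering the entry describes.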
- Unsupervised Manifold Alignment with Joint Multidimensional Scaling [4.683612295430957]
We introduce Joint Multidimensional Scaling, which maps datasets from two different domains to a common low-dimensional Euclidean space.
Our approach integrates Multidimensional Scaling (MDS) and Wasserstein Procrustes analysis into a joint optimization problem.
We demonstrate the effectiveness of our approach in several applications, including joint visualization of two datasets, unsupervised heterogeneous domain adaptation, graph matching, and protein structure alignment.
arXiv Detail & Related papers (2022-07-06T21:02:42Z)
- Autonomous Dimension Reduction by Flattening Deformation of Data Manifold under an Intrinsic Deforming Field [0.0]
A new dimension reduction (DR) method is proposed, based on autonomous deformation of the data manifold.
The flattening of the data manifold is achieved as an emergent behavior under elastic and repelling interactions between data points.
The proposed method provides a novel geometric viewpoint on dimension reduction.
arXiv Detail & Related papers (2021-10-21T07:20:23Z)
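A toy sketch of the emergent-flattening idea from the flattening-deformation entry above, under the simplest possible reading (springs to nearest neighbors plus a weak repulsion between non-neighbors); all function and parameter names here are hypothetical, not the paper's:

```python
import numpy as np
from scipy.spatial.distance import cdist

def flatten_step(Y, D0, neighbors, eta=0.1, k_rep=1e-3):
    """One deformation step: elastic springs to neighbors plus weak repulsion.

    Y         : (n, d) current positions
    D0        : (n, n) original pairwise distances (spring rest lengths)
    neighbors : (n, k) indices of each point's k nearest neighbors
    """
    n = len(Y)
    F = np.zeros_like(Y)
    for i in range(n):
        # elastic forces restore the original distances to the k neighbors
        for j in neighbors[i]:
            diff = Y[j] - Y[i]
            dist = np.linalg.norm(diff) + 1e-12
            F[i] += (dist - D0[i, j]) * diff / dist
        # weak repelling forces from all non-neighbor points unbend the manifold
        mask = np.ones(n, dtype=bool)
        mask[i] = False
        mask[neighbors[i]] = False
        diff = Y[mask] - Y[i]
        dist = np.linalg.norm(diff, axis=1, keepdims=True) + 1e-12
        F[i] -= k_rep * (diff / dist ** 2).sum(axis=0)
    return Y + eta * F

# Usage: a noisy circular arc gradually unbends toward a straight segment.
rng = np.random.default_rng(0)
t = np.linspace(0.0, np.pi, 100)
X = np.c_[np.cos(t), np.sin(t)] + rng.normal(scale=0.01, size=(100, 2))
D0 = cdist(X, X)
neighbors = np.argsort(D0, axis=1)[:, 1:6]  # 5 nearest neighbors, self excluded
Y = X.copy()
for _ in range(1000):
    Y = flatten_step(Y, D0, neighbors)
```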
- Inferring Manifolds From Noisy Data Using Gaussian Processes [17.166283428199634]
Most existing manifold learning algorithms replace the original data with lower-dimensional coordinates.
This article proposes a new methodology for addressing these problems, allowing interpolation of the estimated manifold between fitted data points.
arXiv Detail & Related papers (2021-10-14T15:50:38Z)
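One plausible instantiation of the Gaussian-process entry above (Isomap for the initial coordinates and an RBF kernel are my assumptions; the paper's construction may differ): fit a GP from low-dimensional coordinates back to ambient space, so the manifold can be evaluated between, and denoised at, the fitted points.

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Noisy samples from a one-dimensional curve embedded in the plane
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 3.0 * np.pi, 150))
X = np.c_[t * np.cos(t), t * np.sin(t)] + rng.normal(scale=0.3, size=(150, 2))

# Low-dimensional coordinates for the noisy points
Z = Isomap(n_neighbors=10, n_components=1).fit_transform(X)

# GP regression maps coordinates -> ambient space; alpha absorbs observation noise
gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0), alpha=0.1)
gp.fit(Z, X)

# Evaluate the estimated manifold *between* the fitted data points
Z_grid = np.linspace(Z.min(), Z.max(), 500)[:, None]
curve = gp.predict(Z_grid)  # (500, 2): a smooth, denoised curve estimate
```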
- Kernel Two-Dimensional Ridge Regression for Subspace Clustering [45.651770340521786]
We propose a novel subspace clustering method for 2D data.
It directly uses 2D data as inputs such that the learning of representations benefits from inherent structures and relationships of the data.
arXiv Detail & Related papers (2020-11-03T04:52:46Z)
- Dimensionality Reduction via Diffusion Map Improved with Supervised Linear Projection [1.7513645771137178]
In this paper, we assume the data samples lie on a single underlying smooth manifold.
We define intra-class and inter-class similarities using pairwise local kernel distances.
We aim to find a linear projection to maximize the intra-class similarities and minimize the inter-class similarities simultaneously.
arXiv Detail & Related papers (2020-08-08T04:26:07Z)
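The projection sought in the diffusion-map entry above can be read as a trace-ratio problem solved by a generalized eigendecomposition, much like LDA with kernel-weighted local similarities. A minimal sketch under that assumption (function and parameter names are mine):

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def supervised_projection(X, y, sigma=1.0, d=2, reg=1e-6):
    """Find a linear projection that maximizes inter-class spread relative
    to intra-class spread, using pairwise local kernel similarities."""
    K = np.exp(-cdist(X, X, 'sqeuclidean') / (2.0 * sigma ** 2))
    same = (y[:, None] == y[None, :]).astype(float)

    def laplacian_scatter(W):
        # v^T X^T (D - W) X v  =  (1/2) sum_ij W_ij (v^T x_i - v^T x_j)^2
        D = np.diag(W.sum(axis=1))
        return X.T @ (D - W) @ X

    S_intra = laplacian_scatter(K * same)          # same-class similarities
    S_inter = laplacian_scatter(K * (1.0 - same))  # cross-class similarities
    # top-d generalized eigenvectors of S_inter v = lambda S_intra v
    vals, vecs = eigh(S_inter, S_intra + reg * np.eye(X.shape[1]))
    return vecs[:, -d:]

# Usage on two Gaussian blobs in 5 dimensions
rng = np.random.default_rng(0)
X = np.r_[rng.normal(0, 1, (50, 5)), rng.normal(2, 1, (50, 5))]
y = np.r_[np.zeros(50), np.ones(50)]
X_low = X @ supervised_projection(X, y)  # (100, 2) projected data
```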
- Random extrapolation for primal-dual coordinate descent [61.55967255151027]
We introduce a randomly extrapolated primal-dual coordinate descent method that adapts to the sparsity of the data matrix and the favorable structures of the objective function.
We show almost sure convergence of the sequence and optimal sublinear convergence rates for the primal-dual gap and objective values, in the general convex-concave case.
arXiv Detail & Related papers (2020-07-13T17:39:35Z)
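For orientation, methods of the family in the entry above target convex-concave saddle-point problems of the generic textbook form below (not the paper's exact setting), with coordinate variants updating one randomly chosen block of $x$ per iteration:

```latex
% f and g convex, g^* the convex conjugate of g, A the (possibly sparse) data
% matrix; the primal-dual gap measures the distance to the saddle point.
\min_{x \in \mathbb{R}^{n}} \; \max_{y \in \mathbb{R}^{m}} \;
  f(x) + \langle A x,\, y \rangle - g^{*}(y)
```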
- Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
Many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method based on a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z)
- Two-Dimensional Semi-Nonnegative Matrix Factorization for Clustering [50.43424130281065]
We propose a new Semi-Nonnegative Matrix Factorization method for 2-dimensional (2D) data, named TS-NMF.
It overcomes the drawback of existing methods that seriously damage the spatial information of the data by converting 2D data to vectors in a preprocessing step.
arXiv Detail & Related papers (2020-05-19T05:54:14Z)
- Ellipsoidal Subspace Support Vector Data Description [98.67884574313292]
We propose a novel method for transforming data into a low-dimensional space optimized for one-class classification.
We provide both linear and non-linear formulations for the proposed method.
The proposed method is observed to converge much faster than the recently proposed Subspace Support Vector Data Description.
arXiv Detail & Related papers (2020-03-20T21:31:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.