DimenFix: A novel meta-dimensionality reduction method for feature
preservation
- URL: http://arxiv.org/abs/2211.16752v1
- Date: Wed, 30 Nov 2022 05:35:22 GMT
- Title: DimenFix: A novel meta-dimensionality reduction method for feature
preservation
- Authors: Qiaodan Luo, Leonardo Christino, Fernando V Paulovich and Evangelos
Milios
- Abstract summary: We propose a novel meta-method, DimenFix, which can be applied on top of any base dimensionality reduction method that involves a gradient-descent-like process.
By allowing users to define the importance of different features, which is then taken into account during the reduction, DimenFix creates new possibilities to visualize and understand a given dataset.
- Score: 64.0476282000118
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dimensionality reduction has become an important research topic as demand for
interpreting high-dimensional datasets has been increasing rapidly in recent
years. There have been many dimensionality reduction methods with good
performance in preserving the overall relationship among data points when
mapping them to a lower-dimensional space. However, these existing methods fail
to account for differences in importance among features.
To address this problem, we propose a novel meta-method, DimenFix, which can
be applied on top of any base dimensionality reduction method that involves a
gradient-descent-like process. By allowing users to define the importance of
different features, which is then taken into account during the reduction,
DimenFix creates new possibilities to visualize and understand a given dataset.
Moreover, DimenFix neither increases the time cost nor reduces the quality of
the dimensionality reduction relative to the base method used.
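The abstract does not spell out DimenFix's exact mechanism, but the general idea of a feature-importance constraint inside a gradient-descent-like reduction can be illustrated. The sketch below is an assumption-laden toy, not the authors' algorithm: a plain gradient-descent MDS in 2-D where, after every gradient step, one embedding axis is pulled toward a user-chosen "important" feature. The function name `dimenfix_like_mds` and the mixing parameter `alpha` are hypothetical.

```python
import numpy as np

def pairwise_dists(X):
    # full Euclidean distance matrix
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def dimenfix_like_mds(X, important, n_iter=100, lr=0.001, alpha=0.5, seed=0):
    """Gradient-descent MDS in 2-D where, after each gradient step, one
    embedding axis is pulled toward a user-chosen important feature.
    Illustrative sketch of the DimenFix idea only, NOT the paper's method."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    D = pairwise_dists(X)
    Y = rng.normal(scale=1e-2, size=(n, 2))
    # standardize the important feature so it lives on the embedding's scale
    target = (important - important.mean()) / (important.std() + 1e-12)
    for _ in range(n_iter):
        d = pairwise_dists(Y) + 1e-12
        # gradient of the raw stress sum_ij (d_ij - D_ij)^2 w.r.t. Y
        coef = (d - D) / d
        grad = 2 * (coef[:, :, None] * (Y[:, None, :] - Y[None, :, :])).sum(axis=1)
        Y -= lr * grad
        # "fix" the dimension: re-impose the important feature on axis 0
        Y[:, 0] = (1 - alpha) * Y[:, 0] + alpha * target
    return Y
```

Because the constraint is applied once per iteration rather than after convergence, the stress optimization and the feature preservation negotiate with each other, which matches the abstract's claim that the importance "is then taken into account during the reduction".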
Related papers
- Dimension reduction via score ratio matching [0.9012198585960441]
We propose a framework, derived from score-matching, to extend gradient-based dimension reduction to problems where gradients are unavailable.
We show that our approach outperforms standard score-matching for problems with low-dimensional structure.
arXiv Detail & Related papers (2024-10-25T22:21:03Z) - Golden Ratio-Based Sufficient Dimension Reduction [6.184279198087624]
We propose a neural network based sufficient dimension reduction method.
It identifies the structural dimension effectively and estimates the central space well.
It takes advantage of the approximation capabilities of neural networks for functions in Barron classes, leading to reduced computation cost.
arXiv Detail & Related papers (2024-10-25T04:15:15Z) - Interpretable Linear Dimensionality Reduction based on Bias-Variance
Analysis [45.3190496371625]
We propose a principled dimensionality reduction approach that maintains the interpretability of the resulting features.
In this way, all features are considered, the dimensionality is reduced, and interpretability is preserved.
arXiv Detail & Related papers (2023-03-26T14:30:38Z) - An Experimental Study of Dimension Reduction Methods on Machine Learning
Algorithms with Applications to Psychometrics [77.34726150561087]
We show that dimension reduction can decrease, increase, or provide the same accuracy as no reduction of variables.
Our tentative results find that dimension reduction tends to lead to better performance when used for classification tasks.
arXiv Detail & Related papers (2022-10-19T22:07:13Z) - Laplacian-based Cluster-Contractive t-SNE for High Dimensional Data
Visualization [20.43471678277403]
We propose LaptSNE, a new graph-based dimensionality reduction method based on t-SNE.
Specifically, LaptSNE leverages the eigenvalue information of the graph Laplacian to shrink the potential clusters in the low-dimensional embedding.
We show how to calculate the gradient analytically, which may be of broad interest when considering optimization with a Laplacian-composited objective.
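The summary says LaptSNE "leverages the eigenvalue information of the graph Laplacian to shrink the potential clusters". The paper's exact objective is not given here; as a minimal sketch of that kind of term, the snippet below computes the sum of the k smallest Laplacian eigenvalues of a Gaussian affinity graph built on an embedding. Small values indicate well-separated clusters, so such a term could act as a cluster-contractive penalty. The function name, `sigma`, and `k` are all assumptions for illustration.

```python
import numpy as np

def laplacian_cluster_penalty(Y, sigma=1.0, k=3):
    """Sum of the k smallest graph-Laplacian eigenvalues of a Gaussian
    affinity graph on the embedding Y. Near-zero values mean the graph
    is close to having k disconnected clusters. Illustrative sketch of
    a Laplacian-based term, not LaptSNE's exact objective."""
    d2 = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))   # Gaussian affinities
    np.fill_diagonal(W, 0.0)             # no self-loops
    L = np.diag(W.sum(axis=1)) - W       # unnormalized graph Laplacian
    evals = np.linalg.eigvalsh(L)        # eigenvalues, ascending order
    return evals[:k].sum()
```

Two far-apart clusters give a near-zero penalty (the graph is almost disconnected), while a single blob gives a clearly positive one; minimizing such a term alongside the t-SNE loss would therefore encourage the embedding to contract into separated clusters.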
arXiv Detail & Related papers (2022-07-25T14:10:24Z) - Revisiting Point Cloud Simplification: A Learnable Feature Preserving
Approach [57.67932970472768]
Mesh and Point Cloud simplification methods aim to reduce the complexity of 3D models while retaining visual quality and relevant salient features.
We propose a fast point cloud simplification method by learning to sample salient points.
The proposed method relies on a graph neural network architecture trained to select an arbitrary, user-defined number of points from the input space and to re-arrange their positions so as to minimize the visual perception error.
arXiv Detail & Related papers (2021-09-30T10:23:55Z) - A Subspace-based Approach for Dimensionality Reduction and Important
Variable Selection [0.0]
This research proposes a new method that produces subspaces, reduced-dimensional physical spaces, based on a randomized search.
When applied to high-dimensional data collected from a composite metal development process, the proposed method shows its superiority in prediction and important variable selection.
arXiv Detail & Related papers (2021-06-03T04:10:34Z) - A Local Similarity-Preserving Framework for Nonlinear Dimensionality
Reduction with Neural Networks [56.068488417457935]
We propose a novel local nonlinear approach named Vec2vec for general-purpose dimensionality reduction.
To train the neural network, we build the neighborhood similarity graph of a matrix and define the context of data points.
Experiments on data classification and clustering over eight real datasets show that Vec2vec outperforms several classical dimensionality reduction methods under statistical hypothesis testing.
arXiv Detail & Related papers (2021-03-10T23:10:47Z) - Multi-point dimensionality reduction to improve projection layout
reliability [77.34726150561087]
In ordinary Dimensionality Reduction (DR), each data instance in an m-dimensional space (original space) is mapped to one point in a d-dimensional space (visual space).
Our solution, named Red Gray Plus, is built upon and extends a combination of ordinary DR and graph drawing techniques.
arXiv Detail & Related papers (2021-01-15T17:17:02Z) - Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
However, many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method for a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.