Practical applications of metric space magnitude and weighting vectors
- URL: http://arxiv.org/abs/2006.14063v2
- Date: Thu, 2 Jul 2020 20:48:51 GMT
- Title: Practical applications of metric space magnitude and weighting vectors
- Authors: Eric Bunch, Daniel Dickinson, Jeffery Kline, Glenn Fung
- Abstract summary: The magnitude of a metric space is a real number that aims to quantify the effective number of distinct points in the space.
The contribution of each point to a metric space's global magnitude, which is encoded by the weighting vector, captures much of the underlying geometry of the original metric space.
Surprisingly, when the metric space is Euclidean, the weighting vector also serves as an effective tool for boundary detection.
- Score: 8.212024590297894
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Metric space magnitude, an active subject of research in algebraic topology,
originally arose in the context of biology, where it was used to represent the
effective number of distinct species in an environment. In a more general
setting, the magnitude of a metric space is a real number that aims to quantify
the effective number of distinct points in the space. The contribution of each
point to a metric space's global magnitude, which is encoded by the weighting
vector, captures much of the underlying geometry of the original
metric space.
Surprisingly, when the metric space is Euclidean, the weighting vector also
serves as an effective tool for boundary detection. This allows the weighting
vector to serve as the foundation of novel algorithms for classic machine
learning tasks such as classification, outlier detection and active learning.
We demonstrate, using experiments and comparisons on classic benchmark
datasets, the promise of the proposed magnitude and weighting vector-based
approaches.
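As a concrete illustration (not code from the paper), the definitions above can be sketched numerically: form the similarity matrix Z with Z_ij = exp(-t·d(x_i, x_j)), solve Z w = 1 for the weighting vector w, and take magnitude = sum(w). The point sample, the scale parameter t, and the interior/boundary split below are illustrative assumptions, not choices made by the authors.

```python
import numpy as np

def weighting_vector(points, t=1.0):
    # Similarity matrix Z_ij = exp(-t * d(x_i, x_j)) for the Euclidean
    # metric d; the weighting vector w solves Z w = 1.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    Z = np.exp(-t * d)
    return np.linalg.solve(Z, np.ones(len(points)))

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))  # uniform sample from a square
w = weighting_vector(X, t=5.0)
magnitude = w.sum()  # effective number of distinct points at this scale

# Boundary detection: points near the edge of the square tend to
# receive larger weights than interior points.
interior = np.abs(X).max(axis=1) < 0.7  # illustrative interior cutoff
print(f"magnitude: {magnitude:.2f}")
print(f"mean weight, interior: {w[interior].mean():.4f}")
print(f"mean weight, near boundary: {w[~interior].mean():.4f}")
```

Comparing the two mean weights exhibits the boundary-detection behavior the abstract describes: boundary points contribute more to the magnitude than interior points do.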
Related papers
- Canonical Variates in Wasserstein Metric Space [16.668946904062032]
We employ the Wasserstein metric to measure distances between distributions, which are then used by distance-based classification algorithms.
Central to our investigation is dimension reduction within the Wasserstein metric space to enhance classification accuracy.
We introduce a novel approach grounded in the principle of maximizing Fisher's ratio, defined as the ratio of between-class variation to within-class variation.
arXiv Detail & Related papers (2024-05-24T17:59:21Z)
- Embedding Trajectory for Out-of-Distribution Detection in Mathematical Reasoning [50.84938730450622]
We propose a trajectory-based method TV score, which uses trajectory volatility for OOD detection in mathematical reasoning.
Our method outperforms all traditional algorithms on GLMs under mathematical reasoning scenarios.
Our method can be extended to more applications with high-density features in output spaces, such as multiple-choice questions.
arXiv Detail & Related papers (2024-05-22T22:22:25Z)
- Unleash the Potential of 3D Point Cloud Modeling with A Calibrated Local Geometry-driven Distance Metric [62.365983810610985]
We propose a novel distance metric called Calibrated Local Geometry Distance (CLGD)
CLGD computes the difference between the underlying 3D surfaces calibrated and induced by a set of reference points.
As a generic metric, CLGD has the potential to advance 3D point cloud modeling.
arXiv Detail & Related papers (2023-06-01T11:16:20Z)
- Metric Space Magnitude and Generalisation in Neural Networks [12.110483221042903]
This work quantifies the learning process of deep neural networks through the lens of a novel topological invariant called magnitude.
We use magnitude to study the internal representations of neural networks and propose a new method for determining their generalisation capabilities.
arXiv Detail & Related papers (2023-05-09T17:04:50Z)
- Transferable Deep Metric Learning for Clustering [1.2762298148425795]
Clustering in high-dimensional spaces is a difficult task; the usual distance metrics may no longer be appropriate under the curse of dimensionality.
We show that we can learn a metric on a labelled dataset, then apply it to cluster a different dataset.
We achieve results competitive with the state-of-the-art while using only a small number of labelled training datasets and shallow networks.
arXiv Detail & Related papers (2023-02-13T17:09:59Z)
- On Hyperbolic Embeddings in 2D Object Detection [76.12912000278322]
We study whether a hyperbolic geometry better matches the underlying structure of the object classification space.
We incorporate a hyperbolic classifier in two-stage, keypoint-based, and transformer-based object detection architectures.
We observe categorical class hierarchies emerging in the structure of the classification space, resulting in lower classification errors and boosting the overall object detection performance.
arXiv Detail & Related papers (2022-03-15T16:43:40Z)
- Weighting vectors for machine learning: numerical harmonic analysis applied to boundary detection [3.8848561367220276]
We show that when the metric space is Euclidean, the weighting vector serves as an effective tool for boundary detection.
We demonstrate performance that is competitive or exceeds performance of state-of-the-art techniques on benchmark data sets.
arXiv Detail & Related papers (2021-06-01T22:14:22Z)
- Quadric hypersurface intersection for manifold learning in feature space [52.83976795260532]
A manifold learning technique suitable for moderately high-dimensional and large datasets.
The technique is learned from the training data in the form of an intersection of quadric hypersurfaces.
At test time, this manifold can be used to introduce an outlier score for arbitrary new points.
arXiv Detail & Related papers (2021-02-11T18:52:08Z)
- The role of feature space in atomistic learning [62.997667081978825]
Physically-inspired descriptors play a key role in the application of machine-learning techniques to atomistic simulations.
We introduce a framework to compare different sets of descriptors, and different ways of transforming them by means of metrics and kernels.
We compare representations built in terms of n-body correlations of the atom density, quantitatively assessing the information loss associated with the use of low-order features.
arXiv Detail & Related papers (2020-09-06T14:12:09Z)
- Spatial Pyramid Based Graph Reasoning for Semantic Segmentation [67.47159595239798]
We apply graph convolution into the semantic segmentation task and propose an improved Laplacian.
The graph reasoning is directly performed in the original feature space organized as a spatial pyramid.
We achieve comparable performance with advantages in computational and memory overhead.
arXiv Detail & Related papers (2020-03-23T12:28:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.