Learning Locally Adaptive Metrics that Enhance Structural Representation with $\texttt{LAMINAR}$
- URL: http://arxiv.org/abs/2411.08557v1
- Date: Wed, 13 Nov 2024 12:13:15 GMT
- Title: Learning Locally Adaptive Metrics that Enhance Structural Representation with $\texttt{LAMINAR}$
- Authors: Christian Kleiber, William H. Oliver, Tobias Buck
- Abstract summary: $\texttt{LAMINAR}$ is an unsupervised machine learning pipeline designed to enhance the representation of structure within data.
It produces a locally-adaptive metric that yields structurally-informative, density-based distances.
We demonstrate the utility of $\texttt{LAMINAR}$ by comparing its output to the Euclidean metric for structured data sets.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present $\texttt{LAMINAR}$, a novel unsupervised machine learning pipeline designed to enhance the representation of structure within data by producing a more informative distance metric. Analysis methods in the physical sciences often rely on standard metrics to define geometric relationships in data, which may fail to capture the underlying structure of complex data sets. $\texttt{LAMINAR}$ addresses this by using a continuous normalising flow and inverse transform sampling to define a Riemannian manifold in the data space, without requiring the user to specify a metric over the data a priori. The result is a locally-adaptive metric that produces structurally-informative, density-based distances. We demonstrate the utility of $\texttt{LAMINAR}$ by comparing its output to the Euclidean metric for structured data sets.
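To make the idea of a locally-adaptive, density-based distance concrete, the sketch below computes a path length weighted by the inverse of the local data density, so that paths crossing sparse regions are "longer" than paths through dense structure. This is only an illustration of the concept, not $\texttt{LAMINAR}$'s actual pipeline: a Gaussian mixture stands in for the continuous normalising flow, and the function name `density_weighted_distance` is hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Fit a density model to structured 2-D data (two offset blobs).
# LAMINAR fits a continuous normalising flow for this step; a Gaussian
# mixture is used here purely as a stand-in density estimator.
rng = np.random.default_rng(0)
data = np.vstack([
    rng.normal([0.0, 0.0], 0.3, size=(500, 2)),
    rng.normal([3.0, 0.0], 0.3, size=(500, 2)),
])
density_model = GaussianMixture(n_components=2, random_state=0).fit(data)

def density_weighted_distance(a, b, n_steps=200):
    """Approximate a density-based path length between points a and b.

    The straight segment from a to b is discretised, and each step is
    weighted by the inverse local density, so steps through sparse
    regions contribute more to the total distance.
    """
    ts = np.linspace(0.0, 1.0, n_steps + 1)
    points = a[None, :] + ts[:, None] * (b - a)[None, :]
    dens = np.exp(density_model.score_samples(points))      # p(x) at each step
    weight = 1.0 / (0.5 * (dens[:-1] + dens[1:]) + 1e-12)   # midpoint rule
    step_len = np.linalg.norm(np.diff(points, axis=0), axis=1)
    return float(np.sum(step_len * weight))

# Two points inside the same blob vs. two points in different blobs:
# the Euclidean distances are comparable, but the density-based distance
# across the sparse gap between blobs is far larger.
same_blob = density_weighted_distance(np.array([-0.4, 0.0]), np.array([0.4, 0.0]))
cross_blob = density_weighted_distance(np.array([0.4, 0.0]), np.array([2.6, 0.0]))
print(f"within structure: {same_blob:.2f}, across the gap: {cross_blob:.2f}")
```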
Related papers
- Adaptive Locally Linear Embedding [10.331256742632835]
A novel approach, Adaptive Locally Linear Embedding (ALLE), is introduced to address the limitations of fixed distance metrics in locally linear embedding.
Experimental results demonstrate that ALLE significantly improves the alignment between neighborhoods in the input and feature spaces.
This approach advances manifold learning by tailoring distance metrics to the underlying data, providing a robust solution for capturing intricate relationships in high-dimensional datasets.
arXiv Detail & Related papers (2025-04-09T12:40:13Z) - Score-based pullback Riemannian geometry [10.649159213723106]
We propose a framework for data-driven Riemannian geometry that is scalable in both geometry and learning.
We produce high-quality geodesics through the data support and reliably estimate the intrinsic dimension of the data manifold.
Our framework can naturally be used with anisotropic normalizing flows by adopting isometry regularization during training.
arXiv Detail & Related papers (2024-10-02T18:52:12Z) - Induced Covariance for Causal Discovery in Linear Sparse Structures [55.2480439325792]
Causal models seek to unravel the cause-effect relationships among variables from observed data.
This paper introduces a novel causal discovery algorithm designed for settings in which variables exhibit linearly sparse relationships.
arXiv Detail & Related papers (2024-10-02T04:01:38Z) - Taming CLIP for Fine-grained and Structured Visual Understanding of Museum Exhibits [59.66134971408414]
We aim to adapt CLIP for fine-grained and structured understanding of museum exhibits.
Our dataset is the first of its kind in the public domain.
The proposed method (MUZE) learns to map CLIP's image embeddings to the tabular structure by means of a transformer-based parsing network (parseNet).
arXiv Detail & Related papers (2024-09-03T08:13:06Z) - IsUMap: Manifold Learning and Data Visualization leveraging Vietoris-Rips filtrations [0.08796261172196743]
We present a systematic and detailed construction of a metric representation for locally distorted metric spaces.
Our approach addresses limitations in existing methods by accommodating non-uniform data distributions and intricate local geometries.
arXiv Detail & Related papers (2024-07-25T07:46:30Z) - (Deep) Generative Geodesics [57.635187092922976]
We introduce a new Riemannian metric to assess the similarity between any two data points.
Our metric leads to the conceptual definition of generative distances and generative geodesics.
Their approximations are proven to converge to their true values under mild conditions.
arXiv Detail & Related papers (2024-07-15T21:14:02Z) - Order-based Structure Learning with Normalizing Flows [7.972479571606131]
Estimating the causal structure of observational data is a challenging search problem that scales super-exponentially with graph size.
Existing methods use continuous relaxations to make this problem computationally tractable, but often restrict the data-generating process to additive noise models (ANMs).
We present Order-based Structure Learning with Normalizing Flows (OSLow), a framework that relaxes these assumptions using autoregressive normalizing flows.
arXiv Detail & Related papers (2023-08-14T22:17:33Z) - StructGPT: A General Framework for Large Language Model to Reason over
Structured Data [117.13986738340027]
We develop an Iterative Reading-then-Reasoning (IRR) approach for solving question answering tasks based on structured data.
Our approach can significantly boost the performance of ChatGPT and achieve performance comparable to full-data supervised-tuning baselines.
arXiv Detail & Related papers (2023-05-16T17:45:23Z) - GenURL: A General Framework for Unsupervised Representation Learning [58.59752389815001]
Unsupervised representation learning (URL) learns compact embeddings of high-dimensional data without supervision.
We propose a unified similarity-based URL framework, GenURL, which can smoothly adapt to various URL tasks.
Experiments demonstrate that GenURL achieves consistent state-of-the-art performance in self-supervised visual learning, unsupervised knowledge distillation (KD), graph embeddings (GE), and dimension reduction.
arXiv Detail & Related papers (2021-10-27T16:24:39Z) - Capturing Structural Locality in Non-parametric Language Models [85.94669097485992]
We propose a simple yet effective approach for adding locality information into non-parametric language models.
Experiments on two different domains, Java source code and Wikipedia text, demonstrate that locality features improve model efficacy.
arXiv Detail & Related papers (2021-10-06T15:53:38Z) - LOCA: LOcal Conformal Autoencoder for standardized data coordinates [6.608924227377152]
We present a method for learning an embedding in $\mathbb{R}^d$ that is isometric to the latent variables of the manifold.
Our embedding is obtained using a LOcal Conformal Autoencoder (LOCA), an algorithm that constructs an embedding to rectify deformations.
We also apply LOCA to single-site Wi-Fi localization data, and to $3$-dimensional curved surface estimation.
arXiv Detail & Related papers (2020-04-15T17:49:37Z) - Learning Flat Latent Manifolds with VAEs [16.725880610265378]
We propose an extension to the framework of variational auto-encoders, where the Euclidean metric is a proxy for the similarity between data points.
We replace the compact prior typically used in variational auto-encoders with a recently presented, more expressive hierarchical one.
We evaluate our method on a range of data-sets, including a video-tracking benchmark.
arXiv Detail & Related papers (2020-02-12T09:54:52Z)