Multi-chart flows
- URL: http://arxiv.org/abs/2106.03500v1
- Date: Mon, 7 Jun 2021 10:37:06 GMT
- Title: Multi-chart flows
- Authors: Dimitris Kalatzis, Johan Ziruo Ye, Jesper Wohlert, Søren Hauberg
- Abstract summary: We present a flow-based model for concurrently learning topologically non-trivial manifolds and statistical densities on them.
Our model learns the local manifold topology piecewise by "gluing" it back together through a collection of learned coordinate charts.
We show better sample efficiency and competitive or superior performance against current state-of-the-art.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We present Multi-chart flows, a flow-based model for concurrently learning
topologically non-trivial manifolds and statistical densities on them. Current
methods focus on manifolds that are topologically Euclidean, enforce strong
structural priors on the learned models or use operations that do not scale to
high dimensions. In contrast, our model learns the local manifold topology
piecewise by "gluing" it back together through a collection of learned
coordinate charts. We demonstrate the efficiency of our approach on synthetic
data of known manifolds, as well as higher dimensional manifolds of unknown
topology, where we show better sample efficiency and competitive or superior
performance against current state-of-the-art.
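
The charted construction lends itself to a compact implementation. Below is a minimal, illustrative PyTorch sketch of the idea: the manifold is covered by several learned coordinate charts, each mapping between the ambient space and a low-dimensional latent space, and a gating network softly assigns points to charts to "glue" the pieces back together. All module names, dimensions, and the reconstruction objective are assumptions made for illustration; the paper's actual model additionally learns a flow-based density in each chart's latent space.

```python
# Illustrative sketch only: a collection of learned coordinate charts plus a
# gating network that softly assigns points to charts. Not the authors' code.
import torch
import torch.nn as nn


class Chart(nn.Module):
    """One coordinate chart: maps between ambient (D-dim) and local (d-dim) coordinates."""

    def __init__(self, d: int, D: int, hidden: int = 64):
        super().__init__()
        # Chart map: ambient space -> local coordinates.
        self.encode = nn.Sequential(nn.Linear(D, hidden), nn.Tanh(), nn.Linear(hidden, d))
        # Inverse chart map: local coordinates -> ambient space.
        self.decode = nn.Sequential(nn.Linear(d, hidden), nn.Tanh(), nn.Linear(hidden, D))


class MultiChartModel(nn.Module):
    def __init__(self, n_charts: int, d: int, D: int):
        super().__init__()
        self.charts = nn.ModuleList([Chart(d, D) for _ in range(n_charts)])
        # Gating network: a soft partition of unity over the charts.
        self.gate = nn.Sequential(nn.Linear(D, 64), nn.Tanh(), nn.Linear(64, n_charts))

    def reconstruct(self, x: torch.Tensor) -> torch.Tensor:
        """Project x onto the learned manifold as a gate-weighted combination
        of per-chart round trips (encode to local coordinates, then decode)."""
        weights = torch.softmax(self.gate(x), dim=-1)             # (B, K)
        recons = torch.stack(
            [chart.decode(chart.encode(x)) for chart in self.charts], dim=1
        )                                                          # (B, K, D)
        return (weights.unsqueeze(-1) * recons).sum(dim=1)        # (B, D)


if __name__ == "__main__":
    model = MultiChartModel(n_charts=4, d=2, D=10)
    x = torch.randn(8, 10)
    loss = ((model.reconstruct(x) - x) ** 2).mean()  # simple reconstruction objective
    loss.backward()
    print(loss.item())
```

The soft gating is one plausible way to combine overlapping charts; a full model would also fit a normalizing flow in each chart's latent space so that densities can be evaluated on the learned manifold.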
Related papers
- Categorical Flow Matching on Statistical Manifolds [12.646272756981672]
We introduce a flow-matching framework on the manifold of parameterized probability measures inspired by information geometry.
We develop an efficient training and sampling algorithm that overcomes numerical instability via a diffeomorphism between manifolds.
We show that SFM can learn more complex patterns on the statistical manifold where existing models often fail due to strong prior assumptions.
arXiv Detail & Related papers (2024-05-26T05:50:39Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z) - Tensor Decompositions Meet Control Theory: Learning General Mixtures of Linear Dynamical Systems [19.47235707806519]
We give a new approach to learning mixtures of linear dynamical systems based on tensor decompositions.
Our algorithm succeeds without strong separation conditions on the components, and can be used to compete with the Bayes optimal clustering of the trajectories.
arXiv Detail & Related papers (2023-07-13T03:00:01Z) - A Heat Diffusion Perspective on Geodesic Preserving Dimensionality Reduction [66.21060114843202]
We propose a more general heat kernel based manifold embedding method that we call heat geodesic embeddings.
Results show that our method outperforms existing state of the art in preserving ground truth manifold distances.
We also showcase our method on single cell RNA-sequencing datasets with both continuum and cluster structure.
arXiv Detail & Related papers (2023-05-30T13:58:50Z) - FMGNN: Fused Manifold Graph Neural Network [102.61136611255593]
Graph representation learning has been widely studied and demonstrated effectiveness in various graph tasks.
We propose the Fused Manifold Graph Neural Network (FMGNN), a novel GNN architecture that embeds graphs into different manifolds during training.
Our experiments demonstrate that FMGNN yields superior performance over strong baselines on the benchmarks of node classification and link prediction tasks.
arXiv Detail & Related papers (2023-04-03T15:38:53Z) - Semi-Supervised Manifold Learning with Complexity Decoupled Chart Autoencoders [45.29194877564103]
This work introduces a chart autoencoder with an asymmetric encoding-decoding process that can incorporate additional semi-supervised information such as class labels.
We discuss the approximation power of such networks and derive a bound that essentially depends on the intrinsic dimension of the data manifold rather than the dimension of the ambient space.
arXiv Detail & Related papers (2022-08-22T19:58:03Z) - Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z) - Lossless Compression of Structured Convolutional Models via Lifting [14.63152363481139]
We introduce a simple and efficient technique to detect the symmetries and compress the neural models without loss of any information.
We demonstrate through experiments that such compression can lead to significant speedups of structured convolutional models.
arXiv Detail & Related papers (2020-07-13T08:02:27Z) - Evaluating the Disentanglement of Deep Generative Models through Manifold Topology [66.06153115971732]
We present a method for quantifying disentanglement that only uses the generative model.
We empirically evaluate several state-of-the-art models across multiple datasets.
arXiv Detail & Related papers (2020-06-05T20:54:11Z) - Flows for simultaneous manifold learning and density estimation [12.451050883955071]
Manifold-learning flows (M-flows) represent datasets with a manifold structure more faithfully.
M-flows learn the data manifold and allow for better inference than standard flows in the ambient data space (see the sketch after this list).
arXiv Detail & Related papers (2020-03-31T02:07:48Z)
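
As a companion to the M-flows entry above, here is a minimal sketch of how a density on an embedded manifold can be evaluated with the injective change-of-variables formula, log p(x) = log p(z) - 0.5 * log det(J^T J), where J is the Jacobian of the decoder at z. The decoder, base distribution, and dimensions below are illustrative assumptions, not the M-flows implementation.

```python
# Illustrative sketch: density of x = decoder(z) on an embedded manifold via
# the injective change of variables. Not taken from the M-flows codebase.
import torch


def manifold_log_density(decoder, base_log_prob, z):
    """log p(x) = log p(z) - 0.5 * log det(J^T J), J = Jacobian of decoder at z."""
    J = torch.autograd.functional.jacobian(decoder, z)  # (D, d) for a single point
    gram = J.T @ J                                       # (d, d) pullback metric
    return base_log_prob(z) - 0.5 * torch.logdet(gram)


if __name__ == "__main__":
    d, D = 2, 5
    lin = torch.nn.Linear(d, D)
    decoder = lambda z: torch.tanh(lin(z))               # toy injective decoder
    normal = torch.distributions.Normal(0.0, 1.0)        # base density in the chart
    z = torch.randn(d)
    print(manifold_log_density(decoder, lambda v: normal.log_prob(v).sum(), z).item())
```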
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.