Quantifying Manifolds: Do the manifolds learned by Generative
Adversarial Networks converge to the real data manifold
- URL: http://arxiv.org/abs/2403.05033v1
- Date: Fri, 8 Mar 2024 04:23:50 GMT
- Title: Quantifying Manifolds: Do the manifolds learned by Generative
Adversarial Networks converge to the real data manifold
- Authors: Anupam Chaudhuri, Anj Simmons, Mohamed Abdelrazek
- Abstract summary: We study the dimensions and topological features of the intrinsic manifold learned by the ML model, how these metrics change as we continue to train the model, and whether these metrics converge over the course of training to the metrics of the real data manifold.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents our experiments to quantify the manifolds learned by ML
models (in our experiment, we use a GAN model) as they train. We compare the
manifolds learned at each epoch to the real manifolds representing the real
data. To quantify a manifold, we study the intrinsic dimensions and topological
features of the manifold learned by the ML model, how these metrics change as
we continue to train the model, and whether these metrics converge over the
course of training to the metrics of the real data manifold.
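The intrinsic-dimension half of this quantification can be illustrated with a standard estimator. The sketch below uses the Two-NN estimator (Facco et al., 2017) as one common choice, not necessarily the estimator used in the paper: for points sampled from a d-dimensional manifold, the ratio mu = r2/r1 of each point's two nearest-neighbor distances follows a Pareto law with exponent d, giving the maximum-likelihood estimate d = N / sum(log mu). The topological-feature half (e.g. persistent homology of the samples) would typically require a dedicated library such as ripser and is omitted here.

```python
import numpy as np

def two_nn_intrinsic_dimension(X):
    """Two-NN intrinsic dimension estimate (Facco et al., 2017).

    For each point, take the ratio mu = r2/r1 of its second- and
    first-nearest-neighbor distances; the MLE of the intrinsic
    dimension is N / sum(log mu).
    """
    # Pairwise squared Euclidean distances via broadcasting
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d2, np.inf)   # exclude self-distances
    d2.sort(axis=1)                # row-wise ascending
    r1 = np.sqrt(d2[:, 0])         # nearest-neighbor distance
    r2 = np.sqrt(d2[:, 1])         # second-nearest-neighbor distance
    mu = r2 / r1
    return len(X) / np.sum(np.log(mu))

# Synthetic check: a 2-D manifold linearly embedded in 5-D ambient space
rng = np.random.default_rng(0)
latent = rng.uniform(size=(1000, 2))
A = rng.normal(size=(2, 5))
X = latent @ A
print(two_nn_intrinsic_dimension(X))  # estimate is close to 2
```

Applied to the setting of the paper, one would run such an estimator on batches of real samples and on generator samples at each epoch, and track whether the two estimates converge as training proceeds.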
Related papers
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z) - Manifold Diffusion Fields [11.4726574705951]
We present an approach that unlocks learning of diffusion models of data in non-Euclidean geometries.
We define an intrinsic coordinate system on the manifold via the eigen-functions of the Laplace-Beltrami Operator.
We show that MDF can capture distributions of such functions with better diversity and fidelity than previous approaches.
arXiv Detail & Related papers (2023-05-24T21:42:45Z) - Convolutional Filtering on Sampled Manifolds [122.06927400759021]
We show that convolutional filtering on a sampled manifold converges to continuous manifold filtering.
Our findings are further demonstrated empirically on a problem of navigation control.
arXiv Detail & Related papers (2022-11-20T19:09:50Z) - The Manifold Scattering Transform for High-Dimensional Point Cloud Data [16.500568323161563]
We present practical schemes for implementing the manifold scattering transform to datasets arising in naturalistic systems.
We show that our methods are effective for signal classification and manifold classification tasks.
arXiv Detail & Related papers (2022-06-21T02:15:00Z) - Riemannian Score-Based Generative Modeling [56.20669989459281]
We introduce score-based generative models (SGMs) demonstrating remarkable empirical performance.
Current SGMs make the underlying assumption that the data is supported on a Euclidean manifold with flat geometry.
This prevents the use of these models for applications in robotics, geoscience or protein modeling.
arXiv Detail & Related papers (2022-02-06T11:57:39Z) - Equivariant Manifold Flows [48.21296508399746]
We lay the theoretical foundations for learning symmetry-invariant distributions on arbitrary manifolds via equivariant manifold flows.
We demonstrate the utility of our approach by using it to learn gauge invariant densities over $SU(n)$ in the context of quantum field theory.
arXiv Detail & Related papers (2021-07-19T03:04:44Z) - Multi-chart flows [0.0]
We present a flow-based model for concurrently learning topologically non-trivial manifolds.
Our model learns the local manifold topology piecewise by "gluing" it back together through a collection of learned coordinate charts.
We show better sample efficiency and competitive or superior performance against current state-of-the-art.
arXiv Detail & Related papers (2021-06-07T10:37:06Z) - Flow-based Generative Models for Learning Manifold to Manifold Mappings [39.60406116984869]
We introduce three kinds of invertible layers for manifold-valued data, which are analogous in function to the layers used in flow-based generative models.
We show promising results where we can reliably and accurately reconstruct brain images of a field of orientation distribution functions.
arXiv Detail & Related papers (2020-12-18T02:19:18Z) - Evaluating the Disentanglement of Deep Generative Models through
Manifold Topology [66.06153115971732]
We present a method for quantifying disentanglement that only uses the generative model.
We empirically evaluate several state-of-the-art models across multiple datasets.
arXiv Detail & Related papers (2020-06-05T20:54:11Z) - Flows for simultaneous manifold learning and density estimation [12.451050883955071]
Manifold-learning flows (M-flows) represent datasets with a manifold structure more faithfully.
M-flows learn the data manifold and allow for better inference than standard flows in the ambient data space.
arXiv Detail & Related papers (2020-03-31T02:07:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.