Product Manifold Learning
- URL: http://arxiv.org/abs/2010.09908v1
- Date: Mon, 19 Oct 2020 22:51:06 GMT
- Title: Product Manifold Learning
- Authors: Sharon Zhang, Amit Moscovich, Amit Singer
- Abstract summary: We present a new paradigm for non-linear independent component analysis called manifold factorization.
We demonstrate the potential use of our method for an important and challenging problem in structural biology.
- Score: 7.394643746798322
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider problems of dimensionality reduction and learning data
representations for continuous spaces with two or more independent degrees of
freedom. Such problems occur, for example, when observing shapes with several
components that move independently. Mathematically, if the parameter space of
each continuous independent motion is a manifold, then their combination is
known as a product manifold. In this paper, we present a new paradigm for
non-linear independent component analysis called manifold factorization. Our
factorization algorithm is based on spectral graph methods for manifold
learning and the separability of the Laplacian operator on product spaces.
Recovering the factors of a manifold yields meaningful lower-dimensional
representations and provides a new way to focus on particular aspects of the
data space while ignoring others. We demonstrate the potential use of our
method for an important and challenging problem in structural biology: mapping
the motions of proteins and other large molecules using cryo-electron
microscopy datasets.
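The separability the abstract relies on can be illustrated with a small numerical sketch (this is an illustration of the underlying principle, not the paper's algorithm): on a flat torus S^1 x S^1, Laplacian eigenvalues of the product are sums of eigenvalues of the two circle factors, and a graph Laplacian built from samples reproduces this additivity approximately.

```python
import numpy as np

# Sample a product manifold on a regular grid: the flat torus
# S^1 x S^1 embedded in R^4 as (cos t, sin t, cos p, sin p).
m = 20
t = np.linspace(0, 2 * np.pi, m, endpoint=False)
theta, phi = map(np.ravel, np.meshgrid(t, t))
X = np.column_stack([np.cos(theta), np.sin(theta), np.cos(phi), np.sin(phi)])

# Gaussian-kernel graph Laplacian with symmetric normalization.
# The bandwidth 0.3 is an arbitrary choice for this grid spacing.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-sq / 0.3)
d = W.sum(1)
L = np.eye(len(X)) - W / np.sqrt(np.outer(d, d))

vals = np.linalg.eigvalsh(L)  # ascending eigenvalues
# vals[0] is ~0 (constant mode). vals[1:5] are four-fold degenerate:
# the first eigenvalue of either circle factor (cos/sin on each copy of
# S^1). vals[5:9] correspond to product modes such as cos(t)cos(p),
# whose eigenvalue is approximately vals[1] + vals[1] -- the Laplacian
# eigenvalues of a product manifold add up across the factors.
```

Because the squared Euclidean distance on this embedding decomposes as a sum over the two factors, the Gaussian kernel matrix factors exactly as a Kronecker product, which is the discrete counterpart of the separability used by the factorization algorithm.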
Related papers
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, named soft manifolds, that can address this mismatch.
Using soft manifold for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z)
- Improving Heterogeneous Graph Learning with Weighted Mixed-Curvature Product Manifold [4.640835690336652]
In graph representation learning, the complex geometric structure of the input graph, e.g. hidden relations among nodes, is well captured in embedding space.
Standard Euclidean embedding spaces have a limited capacity in representing graphs of varying structures.
A promising candidate for the faithful embedding of data with varying structure is product manifold embedding spaces.
arXiv Detail & Related papers (2023-07-10T12:20:50Z)
- MUDiff: Unified Diffusion for Complete Molecule Generation [104.7021929437504]
We present a new model for generating a comprehensive representation of molecules, including atom features, 2D discrete molecule structures, and 3D continuous molecule coordinates.
We propose a novel graph transformer architecture to denoise the diffusion process.
Our model is a promising approach for designing stable and diverse molecules and can be applied to a wide range of tasks in molecular modeling.
arXiv Detail & Related papers (2023-04-28T04:25:57Z)
- Dynamic Latent Separation for Deep Learning [67.62190501599176]
A core problem in machine learning is to learn expressive latent variables for model prediction on complex data.
Here, we develop an approach that improves expressiveness, provides partial interpretation, and is not restricted to specific applications.
arXiv Detail & Related papers (2022-10-07T17:56:53Z)
- Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z)
- Tensor-based Multi-view Spectral Clustering via Shared Latent Space [14.470859959783995]
Multi-view Spectral Clustering (MvSC) attracts increasing attention due to diverse data sources.
A new method for MvSC is proposed via a shared latent space derived from the Restricted Kernel Machine framework.
arXiv Detail & Related papers (2022-07-23T17:30:54Z)
- The role of feature space in atomistic learning [62.997667081978825]
Physically-inspired descriptors play a key role in the application of machine-learning techniques to atomistic simulations.
We introduce a framework to compare different sets of descriptors, and different ways of transforming them by means of metrics and kernels.
We compare representations built in terms of n-body correlations of the atom density, quantitatively assessing the information loss associated with the use of low-order features.
arXiv Detail & Related papers (2020-09-06T14:12:09Z)
- Disentangling by Subspace Diffusion [72.1895236605335]
We show that fully unsupervised factorization of a data manifold is possible if the true metric of the manifold is known.
Our work reduces the question of whether unsupervised factorization is possible to the question of whether unsupervised metric learning is possible, providing a unifying insight into the geometric nature of representation learning.
arXiv Detail & Related papers (2020-06-23T13:33:19Z)
- Sample complexity and effective dimension for regression on manifolds [13.774258153124205]
We consider the theory of regression on a manifold using kernel reproducing Hilbert space methods.
We show that certain spaces of smooth functions on a manifold are effectively finite-dimensional, with a complexity that scales according to the manifold dimension.
arXiv Detail & Related papers (2020-06-13T14:09:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.