Generative modeling via tensor train sketching
- URL: http://arxiv.org/abs/2202.11788v6
- Date: Fri, 23 Jun 2023 21:25:28 GMT
- Title: Generative modeling via tensor train sketching
- Authors: Y. Hur, J. G. Hoskins, M. Lindsey, E. M. Stoudenmire, Y. Khoo
- Abstract summary: We introduce a sketching algorithm for constructing a tensor train representation of a probability density from its samples.
We prove that the tensor cores can be recovered with a sample complexity that scales logarithmically in the dimensionality.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we introduce a sketching algorithm for constructing a tensor
train representation of a probability density from its samples. Our method
deviates from the standard recursive SVD-based procedure for constructing a
tensor train. Instead, we formulate and solve a sequence of small linear
systems for the individual tensor train cores. This approach can avoid the
curse of dimensionality that threatens both the algorithmic and sample
complexities of the recovery problem. Specifically, for Markov models under
natural conditions, we prove that the tensor cores can be recovered with a
sample complexity that scales logarithmically in the dimensionality. Finally,
we illustrate the performance of the method with several numerical experiments.
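As a toy illustration of the data structure involved (not the authors' sketching algorithm), the sketch below builds a small discrete probability mass function stored as a tensor train and evaluates it by contracting the cores left to right; all sizes and names are illustrative assumptions.

```python
import numpy as np

# Toy tensor train (TT) density: cores G_k of shape (r_{k-1}, n_k, r_k) with
# r_0 = r_d = 1; p(x_1, ..., x_d) is the product of the selected core slices.
rng = np.random.default_rng(0)
d, n, r = 4, 3, 2                      # number of variables, states per variable, TT rank
ranks = [1, r, r, r, 1]
cores = [np.abs(rng.standard_normal((ranks[k], n, ranks[k + 1])))
         for k in range(d)]            # nonnegative cores give a valid pmf

def tt_value(cores, x):
    """Contract the train left to right at the index tuple x."""
    v = np.ones((1,))
    for G, xk in zip(cores, x):
        v = v @ G[:, xk, :]            # (r_{k-1},) @ (r_{k-1}, r_k)
    return v.item()                    # r_d = 1 leaves a scalar

# Normalize so the values sum to one over the full grid.
Z = sum(tt_value(cores, x) for x in np.ndindex(*([n] * d)))
p = lambda x: tt_value(cores, x) / Z
```

Storage is O(d n r^2) rather than O(n^d), which is what makes tensor trains attractive for high-dimensional densities; the paper's contribution is recovering the cores from samples by solving small linear systems rather than by recursive SVDs.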
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - TERM Model: Tensor Ring Mixture Model for Density Estimation [48.622060998018206]
In this paper, we adopt tensor ring decomposition as the density estimator, which significantly reduces the number of permutation candidates.
A mixture model that incorporates multiple permutation candidates with adaptive weights is further designed, resulting in increased expressive flexibility.
This approach acknowledges that suboptimal permutations can offer distinctive information besides that of optimal permutations.
arXiv Detail & Related papers (2023-12-13T11:39:56Z) - Handling The Non-Smooth Challenge in Tensor SVD: A Multi-Objective Tensor Recovery Framework [15.16222081389267]
We introduce a novel tensor recovery model with a learnable tensor nuclear norm to address the challenge of non-smooth changes in tensor data.
We develop a new optimization algorithm named the Alternating Proximal Multiplier Method (APMM) to iteratively solve the proposed tensor completion model.
In addition, we propose a multi-objective tensor recovery framework based on APMM to efficiently explore the correlations of tensor data across its various dimensions.
arXiv Detail & Related papers (2023-11-23T12:16:33Z) - Predicting Ordinary Differential Equations with Transformers [65.07437364102931]
We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2023-07-24T08:46:12Z) - A Variational Perspective on Solving Inverse Problems with Diffusion Models [101.831766524264]
Inverse tasks can be formulated as inferring a posterior distribution over data.
This is however challenging in diffusion models since the nonlinear and iterative nature of the diffusion process renders the posterior intractable.
We propose a variational approach that by design seeks to approximate the true posterior distribution.
arXiv Detail & Related papers (2023-05-07T23:00:47Z) - Generative Modeling via Hierarchical Tensor Sketching [12.005736675688917]
We propose a hierarchical tensor-network approach for approximating a high-dimensional probability density from its empirical distribution.
The complexity of the resulting algorithm scales linearly in the dimension of the density.
arXiv Detail & Related papers (2023-04-11T15:55:13Z) - Learning Equivariant Energy Based Models with Equivariant Stein Variational Gradient Descent [80.73580820014242]
We focus on the problem of efficient sampling and learning of probability densities by incorporating symmetries in probabilistic models.
We first introduce Equivariant Stein Variational Gradient Descent algorithm -- an equivariant sampling method based on Stein's identity for sampling from densities with symmetries.
We propose new ways of improving and scaling up the training of energy-based models.
arXiv Detail & Related papers (2021-06-15T01:35:17Z) - Tensor decomposition for learning Gaussian mixtures from moments [6.576993289263191]
In data processing and machine learning, an important challenge is to recover and exploit models that accurately represent the data.
We investigate symmetric tensor decomposition methods for tackling this problem, where the tensor is built from empirical moments of the data distribution.
arXiv Detail & Related papers (2021-06-01T15:11:08Z) - Alternating linear scheme in a Bayesian framework for low-rank tensor approximation [5.833272638548154]
We find a low-rank representation for a given tensor by solving a Bayesian inference problem.
We present an algorithm that performs the unscented transform in tensor train format.
arXiv Detail & Related papers (2020-12-21T10:15:30Z) - A Solution for Large Scale Nonlinear Regression with High Rank and Degree at Constant Memory Complexity via Latent Tensor Reconstruction [0.0]
This paper proposes a novel method for learning highly nonlinear, multivariate functions from examples.
Our method takes advantage of the property that continuous functions can be approximated by polynomials, which in turn are representable by tensors.
For learning the models, we present an efficient algorithm that can be implemented in linear time.
arXiv Detail & Related papers (2020-05-04T14:49:14Z) - Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method significantly outperforms state-of-the-art methods.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)
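For background on the Stein Variational Gradient Descent entry above: the plain (non-equivariant) SVGD update with an RBF kernel can be sketched as follows. The equivariant construction of the cited paper is not reproduced here, and the target distribution, bandwidth, and step size are illustrative assumptions.

```python
import numpy as np

def svgd_step(x, grad_logp, h=1.0, eps=0.05):
    """One plain SVGD update. x: (n, d) particles; grad_logp: (n, d) -> (n, d)."""
    diff = x[:, None, :] - x[None, :, :]           # x_i - x_j, shape (n, n, d)
    k = np.exp(-np.sum(diff**2, axis=-1) / h)      # RBF kernel matrix (n, n)
    # The driving term pulls particles toward high density; the kernel-gradient
    # term is a repulsive force that keeps the particles spread out.
    drive = k @ grad_logp(x)
    repulse = (2.0 / h) * np.einsum('ij,ijd->id', k, diff)
    return x + eps * (drive + repulse) / len(x)

# Illustrative run: transport particles toward a standard normal target.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=0.5, size=(50, 1))   # start far from the target
for _ in range(500):
    x = svgd_step(x, lambda x: -x)                 # grad log N(0, I) = -x
```

In practice the bandwidth h is usually set adaptively (e.g. by the median heuristic) rather than fixed as in this sketch.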
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.