Analysis of ODE2VAE with Examples
- URL: http://arxiv.org/abs/2108.04899v1
- Date: Tue, 10 Aug 2021 20:12:26 GMT
- Title: Analysis of ODE2VAE with Examples
- Authors: Batuhan Koyuncu
- Abstract summary: Ordinary Differential Equation Variational Auto-Encoder (ODE2VAE) is a deep latent variable model.
We show that the model learns meaningful latent representations, to an extent, without any supervision.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep generative models aim to learn underlying distributions that generate
the observed data. Because the generative distribution may be complex and
intractable, deep latent variable models use probabilistic frameworks to learn
more expressive joint probability distributions over the data and their
low-dimensional hidden variables. Learning complex probability
distributions over sequential data without any supervision is a difficult task
for deep generative models. Ordinary Differential Equation Variational
Auto-Encoder (ODE2VAE) is a deep latent variable model that aims to learn
complex distributions over high-dimensional sequential data and their
low-dimensional representations. ODE2VAE infers continuous latent dynamics of
the high-dimensional input in a low-dimensional hierarchical latent space. The
hierarchical organization of the continuous latent space embeds a
physics-guided inductive bias in the model. In this paper, we analyze the
latent representations inferred by the ODE2VAE model on three physical motion
datasets: bouncing balls, projectile motion, and the simple pendulum. Through
our experiments, we explore the effects of the model's physics-guided
inductive bias on the learned dynamical latent representations. We show that
the model learns meaningful latent representations, to an extent, without any
supervision.
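To make the hierarchical latent space concrete: the model splits the latent state into a position and a velocity component, and only the acceleration is learned. Below is a minimal sketch of such second-order latent dynamics, assuming plain PyTorch and a fixed-step Euler loop; the names (AccelerationNet, integrate_second_order) are ours, and the actual model uses a Bayesian neural network for the acceleration field together with a proper ODE solver.

```python
# A minimal, illustrative sketch of second-order latent dynamics in the style
# of ODE2VAE. Not the authors' implementation: names and sizes are placeholders.
import torch
import torch.nn as nn

class AccelerationNet(nn.Module):
    """Maps a (position, velocity) latent pair to a latent acceleration."""
    def __init__(self, latent_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * latent_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, latent_dim),
        )

    def forward(self, s, v):
        return self.net(torch.cat([s, v], dim=-1))

def integrate_second_order(s0, v0, f, dt=0.1, steps=20):
    """Euler-style integration of ds/dt = v, dv/dt = f(s, v): the
    position/velocity split that embeds the physics-guided inductive bias."""
    s, v = s0, v0
    trajectory = [s]
    for _ in range(steps):
        s = s + dt * v            # position driven by velocity
        v = v + dt * f(s, v)      # velocity driven by learned acceleration
        trajectory.append(s)
    return torch.stack(trajectory)  # (steps + 1, batch, latent_dim)

latent_dim = 8
f = AccelerationNet(latent_dim)
s0 = torch.randn(4, latent_dim)   # an encoder would produce these from data
v0 = torch.randn(4, latent_dim)
zs = integrate_second_order(s0, v0, f)
print(zs.shape)  # torch.Size([21, 4, 8])
```

In the full model, an encoder infers s0 and v0 from the first frames of a sequence and a decoder maps each integrated position back to pixel space; the position/velocity hierarchy is what carries the physics-guided inductive bias.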
Related papers
- Identifiable Representation and Model Learning for Latent Dynamic Systems [0.0]
We study the problem of identifiable representation and model learning for latent dynamic systems.
We prove that, for linear or affine nonlinear latent dynamic systems, it is possible to identify the representations up to scaling and determine the models up to some simple transformations.
arXiv Detail & Related papers (2024-10-23T13:55:42Z)
- Learning Divergence Fields for Shift-Robust Graph Representations [73.11818515795761]
In this work, we propose a geometric diffusion model with learnable divergence fields for the challenging problem with interdependent data.
We derive a new learning objective through causal inference, which can guide the model to learn generalizable patterns of interdependence that are insensitive to shifts across domains.
arXiv Detail & Related papers (2024-06-07T14:29:21Z)
- Recurrent Deep Kernel Learning of Dynamical Systems [0.5825410941577593]
Digital twins require computationally efficient reduced-order models (ROMs) that can accurately describe complex dynamics of physical assets.
We propose a data-driven, non-intrusive stochastic variational deep kernel learning (SVDKL) method to discover low-dimensional latent spaces from data.
Results show that our framework is capable of (i) denoising and reconstructing measurements, (ii) learning compact representations of system states, (iii) predicting system evolution in low-dimensional latent spaces, and (iv) modeling uncertainties.
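For readers unfamiliar with deep kernels, the idea is to compose a neural feature extractor with a standard kernel and place a GP on top. The sketch below is a toy, assumption-laden version: it uses a random-weight "network" and exact GP regression, whereas the actual SVDKL method trains the extractor jointly with a stochastic variational GP.

```python
# A toy deep kernel: neural features composed with an RBF kernel, then exact
# GP regression. All sizes, weights, and data below are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(1, 16)), rng.normal(size=(16, 2))

def features(x):
    """Tiny random-weight 'network' standing in for a learned extractor."""
    return np.tanh(x @ W1) @ W2

def deep_kernel(xa, xb, lengthscale=1.0):
    fa, fb = features(xa), features(xb)
    d2 = ((fa[:, None, :] - fb[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

# 1-D regression data: noisy observations of a latent dynamical signal.
X = np.linspace(-3, 3, 40)[:, None]
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=40)
Xs = np.linspace(-3, 3, 200)[:, None]

K = deep_kernel(X, X) + 0.01 * np.eye(40)    # noise jitter on the diagonal
Ks = deep_kernel(Xs, X)
mean = Ks @ np.linalg.solve(K, y)            # GP posterior mean
var = deep_kernel(Xs, Xs).diagonal() - np.einsum(
    "ij,ji->i", Ks, np.linalg.solve(K, Ks.T))  # GP posterior variance
print(mean.shape, var.shape)                 # (200,) (200,)
```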
arXiv Detail & Related papers (2024-05-30T07:49:02Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
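As a toy illustration of what an ODE-based sampling trajectory is, the sketch below Euler-integrates the probability-flow ODE for a 1-D Gaussian data distribution, where the score is available in closed form. The variance-exploding schedule sigma(t) = t and all constants are our assumptions, not the paper's setup.

```python
# Probability-flow ODE sampling for a 1-D Gaussian toy with a closed-form score.
import numpy as np

mu, s = 2.0, 0.5             # data distribution: N(mu, s^2)
T, steps = 10.0, 1000
ts = np.linspace(T, 1e-3, steps)

def score(x, t):
    """Closed-form score of p_t = N(mu, s^2 + t^2) under x_t = x_0 + t * eps."""
    return -(x - mu) / (s**2 + t**2)

rng = np.random.default_rng(0)
x = rng.normal(0.0, T, size=5000)        # samples from the wide prior at t = T
traj = [x.copy()]
for i in range(steps - 1):
    t, dt = ts[i], ts[i + 1] - ts[i]     # dt < 0: integrating backward in time
    x = x + dt * (-t * score(x, t))      # probability-flow ODE drift for sigma(t) = t
    traj.append(x.copy())

path = np.stack(traj)
straightness = np.abs(path[-1] - path[0]) / np.abs(np.diff(path, axis=0)).sum(0)
print(x.mean(), x.std())      # approaches mu = 2.0 and s = 0.5 as T grows
print(straightness.mean())    # near 1.0: each sample's trajectory is almost straight
```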
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Synthetic location trajectory generation using categorical diffusion models [50.809683239937584]
Diffusion probabilistic models (DPMs) have rapidly evolved into one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
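A categorical diffusion forward process is easy to sketch: each discrete token is kept with probability 1 - beta_t and otherwise resampled uniformly over the vocabulary, i.e. q(x_t | x_{t-1}) = Cat((1 - beta_t) x_{t-1} + beta_t / K). The toy location vocabulary and noise schedule below are placeholders, and the learned reverse (denoising) process that actually generates ILTs is omitted.

```python
# Forward (noising) process of a multinomial diffusion over location tokens.
import numpy as np

rng = np.random.default_rng(0)
K = 6                                                 # number of discrete locations
trajectory = np.array([0, 0, 1, 2, 2, 3, 3, 3, 4])    # a toy visit sequence
betas = np.linspace(0.02, 0.3, 10)                    # noise schedule over 10 steps

x = trajectory.copy()
for beta in betas:
    # With prob. (1 - beta) keep each token, else resample uniformly over K.
    resample = rng.random(x.shape) < beta
    x = np.where(resample, rng.integers(0, K, size=x.shape), x)

print(trajectory)  # original ILT
print(x)           # noised sequence drifting toward uniform noise
```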
arXiv Detail & Related papers (2024-02-19T15:57:39Z)
- Latent Dynamical Implicit Diffusion Processes [0.0]
We propose a novel latent variable model named latent dynamical implicit diffusion processes (LDIDPs).
LDIDPs utilize implicit diffusion processes to sample from dynamical latent processes and generate sequential observation samples accordingly.
We demonstrate that LDIDPs can accurately learn the dynamics over latent dimensions.
arXiv Detail & Related papers (2023-06-12T12:43:27Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
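The geodesic machinery referenced here is usually built on the pullback metric: a decoder g induces G(z) = J_g(z)^T J_g(z) on the latent space, and geodesics minimize curve energy under G. A hedged sketch with a toy decoder (not VTAE's architecture) follows.

```python
# Pullback metric of a toy decoder and the discrete energy of a latent curve.
import torch

def decoder(z):
    """Toy nonlinear decoder from 2-D latents to 3-D 'data' space."""
    return torch.stack([z[0], z[1], torch.sin(z[0]) * torch.cos(z[1])])

def pullback_metric(z):
    J = torch.autograd.functional.jacobian(decoder, z)  # (3, 2)
    return J.T @ J                                      # (2, 2) metric G(z)

def curve_energy(zs):
    """Discrete energy of a latent curve: sum of dz^T G(z) dz segments."""
    energy = torch.zeros(())
    for a, b in zip(zs[:-1], zs[1:]):
        dz = b - a
        G = pullback_metric((a + b) / 2)
        energy = energy + dz @ G @ dz
    return energy

# Straight latent line between two latents; a geodesic routine would deform
# the interior points to lower this energy.
z0, z1 = torch.tensor([0.0, 0.0]), torch.tensor([2.0, 1.0])
ts = torch.linspace(0, 1, 10).unsqueeze(1)
line = (1 - ts) * z0 + ts * z1
print(curve_energy(line))  # energy of the straight-line curve under G
```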
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion [66.21290235237808]
We introduce an energy constrained diffusion model which encodes a batch of instances from a dataset into evolutionary states.
We provide rigorous theory that implies closed-form optimal estimates for the pairwise diffusion strength among arbitrary instance pairs.
Experiments highlight the wide applicability of our model as a general-purpose encoder backbone with superior performance in various tasks.
arXiv Detail & Related papers (2023-01-23T15:18:54Z)
- Dynamic Latent Separation for Deep Learning [67.62190501599176]
A core problem in machine learning is to learn expressive latent variables for model prediction on complex data.
Here, we develop an approach that improves expressiveness, provides partial interpretability, and is not restricted to specific applications.
arXiv Detail & Related papers (2022-10-07T17:56:53Z)
- Learning and Inference in Sparse Coding Models with Langevin Dynamics [3.0600309122672726]
We describe a system capable of inference and learning in a probabilistic latent variable model.
We demonstrate this idea for a sparse coding model by deriving a continuous-time equation for inferring its latent variables via Langevin dynamics.
We show that Langevin dynamics lead to an efficient procedure for sampling from the posterior distribution in the 'L0 sparse' regime, where latent variables are encouraged to be set to zero as opposed to having a small L1 norm.
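The update itself is compact: a <- a + (step / 2) * grad log p(a | x) + sqrt(step) * noise. The sketch below applies it to a toy sparse coding model; for a differentiable gradient it substitutes a Laplace (L1) prior where the paper works in the 'L0 sparse' regime, and the dictionary, dimensions, and step size are all illustrative.

```python
# Langevin sampling of the latent coefficients in a toy sparse coding model
# x = Phi @ a + noise, with a Laplace (L1) prior for differentiability.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_latent = 16, 32
Phi = rng.normal(size=(n_obs, n_latent)) / np.sqrt(n_obs)   # dictionary

a_true = np.zeros(n_latent)
a_true[rng.choice(n_latent, 4, replace=False)] = 3 * rng.normal(size=4)
sigma, lam = 0.05, 2.0
x = Phi @ a_true + sigma * rng.normal(size=n_obs)

def grad_log_post(a):
    """d/da [log N(x; Phi a, sigma^2 I) + log Laplace(a; lam)]."""
    return Phi.T @ (x - Phi @ a) / sigma**2 - lam * np.sign(a)

step = 1e-5
a = np.zeros(n_latent)
samples = []
for i in range(20000):
    a = a + 0.5 * step * grad_log_post(a) + np.sqrt(step) * rng.normal(size=n_latent)
    if i > 10000:                 # discard burn-in, then collect samples
        samples.append(a.copy())

post_mean = np.mean(samples, axis=0)
print(np.round(post_mean[np.abs(a_true) > 0], 2))  # tracks the active coefficients
```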
arXiv Detail & Related papers (2022-04-23T23:16:47Z)
- Dynamical Deep Generative Latent Modeling of 3D Skeletal Motion [15.359134407309726]
Our model decomposes highly correlated skeleton data into a small set of spatial bases of switching temporal processes.
This results in a dynamical deep generative latent model that parses the meaningful intrinsic states in the dynamics of 3D pose data.
arXiv Detail & Related papers (2021-06-18T23:58:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.