Factorized Deep Generative Models for Trajectory Generation with
Spatiotemporal-Validity Constraints
- URL: http://arxiv.org/abs/2009.09333v1
- Date: Sun, 20 Sep 2020 02:06:36 GMT
- Title: Factorized Deep Generative Models for Trajectory Generation with
Spatiotemporal-Validity Constraints
- Authors: Liming Zhang, Liang Zhao, Dieter Pfoser
- Abstract summary: Deep generative models for trajectory data can learn expressive, explanatory models for sophisticated latent patterns.
We first propose novel deep generative models factorizing time-variant and time-invariant latent variables.
We then develop new inference strategies based on variational inference and constrained optimization to encapsulate the spatiotemporal validity.
- Score: 10.960924101404498
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Trajectory data generation is an important domain that characterizes the
generative process of mobility data. Traditional methods heavily rely on
predefined heuristics and distributions and are weak in learning unknown
mechanisms. Inspired by the success of deep generative neural networks for
images and texts, a fast-developing research topic is deep generative models
for trajectory data, which can learn expressive, explanatory models for
sophisticated latent patterns. This is a nascent yet promising domain for many
applications. We first propose novel deep generative models factorizing
time-variant and time-invariant latent variables that characterize global and
local semantics, respectively. We then develop new inference strategies based
on variational inference and constrained optimization to encapsulate the
spatiotemporal validity. New deep neural network architectures have been
developed to implement the inference and generation models with
newly-generalized latent variable priors. The proposed methods achieved
significant improvements in quantitative and qualitative evaluations in
extensive experiments.
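The factorization described in the abstract can be illustrated with a toy sketch: a trajectory is encoded into one time-invariant latent (global, route-level semantics) and a sequence of time-variant latents (local, per-step dynamics), each sampled via the standard VAE reparameterization trick. This is a minimal numpy illustration under assumed shapes and hand-crafted statistics, not the paper's actual neural architecture; the function names and latent dimensions here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var, rng):
    """Sample z = mu + sigma * eps (the VAE reparameterization trick)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def encode_factorized(traj, d_global=4, d_local=2, rng=rng):
    """Toy encoder: map a trajectory of shape (T, 2) to
    - z_g: a single time-invariant latent (global semantics), and
    - z_t: one time-variant latent per step (local dynamics).
    A real model would use neural networks; simple summary statistics
    stand in here purely to show the factorization."""
    T = traj.shape[0]
    # Time-invariant part: statistics of the whole trajectory.
    mu_g = np.resize(traj.mean(axis=0), d_global)
    z_g = reparameterize(mu_g, np.zeros(d_global), rng)
    # Time-variant part: one latent per step, from local displacements.
    deltas = np.diff(traj, axis=0, prepend=traj[:1])
    mu_t = np.resize(deltas, (T, d_local))
    z_t = reparameterize(mu_t, np.zeros((T, d_local)), rng)
    return z_g, z_t

traj = rng.standard_normal((10, 2)).cumsum(axis=0)  # random-walk "trajectory"
z_g, z_t = encode_factorized(traj)
print(z_g.shape, z_t.shape)  # (4,) (10, 2)
```

A generation model would then decode (z_g, z_t) back into a point sequence, with the constrained-optimization step projecting samples onto spatiotemporally valid trajectories (e.g., bounded speeds, road-network adherence).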
Related papers
- Heuristically Adaptive Diffusion-Model Evolutionary Strategy [1.8299322342860518]
Diffusion Models represent a significant advancement in generative modeling.
Our research reveals a fundamental connection between diffusion models and evolutionary algorithms.
Our framework marks a major algorithmic transition, offering increased flexibility, precision, and control in evolutionary optimization processes.
arXiv Detail & Related papers (2024-11-20T16:06:28Z)
- Neural Residual Diffusion Models for Deep Scalable Vision Generation [17.931568104324985]
We propose a unified and massively scalable Neural Residual Diffusion Models framework (Neural-RDM)
The proposed neural residual models obtain state-of-the-art scores on image and video generative benchmarks.
arXiv Detail & Related papers (2024-06-19T04:57:18Z)
- State-Space Modeling in Long Sequence Processing: A Survey on Recurrence in the Transformer Era [59.279784235147254]
This survey provides an in-depth summary of the latest approaches that are based on recurrent models for sequential data processing.
The emerging picture suggests that there is room for thinking of novel routes, constituted by learning algorithms which depart from the standard Backpropagation Through Time.
arXiv Detail & Related papers (2024-06-13T12:51:22Z)
- On the Resurgence of Recurrent Models for Long Sequences -- Survey and Research Opportunities in the Transformer Era [59.279784235147254]
This survey is aimed at providing an overview of these trends framed under the unifying umbrella of Recurrence.
It emphasizes novel research opportunities that become prominent when abandoning the idea of processing long sequences.
arXiv Detail & Related papers (2024-02-12T23:55:55Z)
- Visual Prompting Upgrades Neural Network Sparsification: A Data-Model Perspective [64.04617968947697]
We introduce a novel data-model co-design perspective to promote superior weight sparsity.
Specifically, customized Visual Prompts are mounted to upgrade neural network sparsification in our proposed VPNs framework.
arXiv Detail & Related papers (2023-12-03T13:50:24Z)
- Generative Learning of Continuous Data by Tensor Networks [45.49160369119449]
We introduce a new family of tensor network generative models for continuous data.
We benchmark the performance of this model on several synthetic and real-world datasets.
Our methods give important theoretical and empirical evidence of the efficacy of quantum-inspired methods for the rapidly growing field of generative learning.
arXiv Detail & Related papers (2023-10-31T14:37:37Z)
- Learning Generative Models for Lumped Rainfall-Runoff Modeling [3.69758875412828]
This study presents a novel generative modeling approach to rainfall-runoff modeling, focusing on the synthesis of realistic daily catchment runoff time series.
Unlike traditional process-based lumped hydrologic models, our approach uses a small number of latent variables to characterize runoff generation processes.
In this study, we trained the generative models using neural networks on data from over 3,000 global catchments and achieved prediction accuracies comparable to current deep learning models.
arXiv Detail & Related papers (2023-09-18T16:07:41Z)
- Deep networks for system identification: a Survey [56.34005280792013]
System identification learns mathematical descriptions of dynamic systems from input-output data.
The main aim of the identified model is to predict new data from previous observations.
We discuss architectures commonly adopted in the literature, like feedforward, convolutional, and recurrent networks.
arXiv Detail & Related papers (2023-01-30T12:38:31Z)
- A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a plethora of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z)
- Generative Deep Learning Techniques for Password Generation [0.5249805590164902]
We study a broad collection of deep learning and probability-based models in the light of password guessing.
We provide novel generative deep-learning models, in the form of variational autoencoders, exhibiting state-of-the-art sampling performance.
We perform a thorough empirical analysis in a unified controlled framework over well-known datasets.
arXiv Detail & Related papers (2020-12-10T14:11:45Z)
- Model-Based Robust Deep Learning: Generalizing to Natural, Out-of-Distribution Data [104.69689574851724]
We propose a paradigm shift from perturbation-based adversarial robustness toward model-based robust deep learning.
Our objective is to provide general training algorithms that can be used to train deep neural networks to be robust against natural variation in data.
arXiv Detail & Related papers (2020-05-20T13:46:31Z)