Diverse Human Motion Prediction via Gumbel-Softmax Sampling from an
Auxiliary Space
- URL: http://arxiv.org/abs/2207.07351v1
- Date: Fri, 15 Jul 2022 09:03:57 GMT
- Title: Diverse Human Motion Prediction via Gumbel-Softmax Sampling from an
Auxiliary Space
- Authors: Lingwei Dang, Yongwei Nie, Chengjiang Long, Qing Zhang, Guiqing Li
- Abstract summary: Diverse human motion prediction aims at predicting multiple possible future pose sequences from a sequence of observed poses.
Previous approaches usually employ deep generative networks to model the conditional distribution of data, and then randomly sample outcomes from the distribution.
We propose a novel sampling strategy for sampling very diverse results from an imbalanced multimodal distribution.
- Score: 34.83587750498361
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diverse human motion prediction aims at predicting multiple possible future
pose sequences from a sequence of observed poses. Previous approaches usually
employ deep generative networks to model the conditional distribution of data,
and then randomly sample outcomes from the distribution. While different
results can be obtained, they are usually the most likely ones which are not
diverse enough. Recent work explicitly learns multiple modes of the conditional
distribution via a deterministic network, which however can only cover a fixed
number of modes within a limited range. In this paper, we propose a novel
sampling strategy for sampling very diverse results from an imbalanced
multimodal distribution learned by a deep generative model. Our method works by
constructing an auxiliary space and making random sampling from the
auxiliary space equivalent to diverse sampling from the target
distribution. We propose a simple yet effective network architecture that
implements this novel sampling strategy, which incorporates a Gumbel-Softmax
coefficient matrix sampling method and an aggressive diversity-promoting hinge
loss function. Extensive experiments demonstrate that our method significantly
improves both the diversity and accuracy of the samples compared with
previous state-of-the-art sampling approaches. Code and pre-trained models are
available at https://github.com/Droliven/diverse_sampling.
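The two ingredients named in the abstract, Gumbel-Softmax coefficient sampling and a diversity-promoting hinge loss, rest on standard building blocks that can be sketched independently of the paper's architecture. The NumPy sketch below is illustrative only: the function names and the exact form of the hinge loss are assumptions, not the authors' implementation (see their repository for that).

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Differentiable, near-one-hot sample from a categorical distribution
    parameterized by `logits`, via the Gumbel-Softmax trick."""
    if rng is None:
        rng = np.random.default_rng()
    # Gumbel(0, 1) noise: -log(-log(U)), U ~ Uniform(0, 1)
    u = rng.uniform(1e-12, 1.0, size=np.shape(logits))
    y = (np.asarray(logits, dtype=float) - np.log(-np.log(u))) / tau
    y -= y.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)

def diversity_hinge(samples, margin=1.0):
    """Hinge-style penalty: zero once every pair of samples is at least
    `margin` apart, positive (and growing) as samples collapse together."""
    loss, pairs = 0.0, 0
    for i in range(len(samples)):
        for j in range(i + 1, len(samples)):
            dist = np.linalg.norm(samples[i] - samples[j])
            loss += max(0.0, margin - dist)
            pairs += 1
    return loss / max(pairs, 1)
```

As the temperature `tau` approaches zero, the Gumbel-Softmax output approaches a one-hot vector while remaining differentiable, which is what allows a coefficient matrix built from such rows to be trained end-to-end.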
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependence for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z) - Diffusion Forcing: Next-token Prediction Meets Full-Sequence Diffusion [61.03681839276652]
Diffusion Forcing is a new training paradigm where a diffusion model is trained to denoise a set of tokens with independent per-token noise levels.
We apply Diffusion Forcing to sequence generative modeling by training a causal next-token prediction model to generate one or several future tokens.
arXiv Detail & Related papers (2024-07-01T15:43:25Z) - DistPred: A Distribution-Free Probabilistic Inference Method for Regression and Forecasting [14.390842560217743]
We propose a novel approach called DistPred for regression and forecasting tasks.
We transform proper scoring rules that measure the discrepancy between the predicted distribution and the target distribution into a differentiable discrete form.
This allows the model to sample numerous samples in a single forward pass to estimate the potential distribution of the response variable.
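A proper scoring rule in sample form can be illustrated with the Continuous Ranked Probability Score (CRPS) and its standard sample-based estimator, CRPS ≈ E|X − y| − ½·E|X − X′|. Whether DistPred optimizes CRPS specifically is an assumption made here for illustration.

```python
import numpy as np

def crps_from_samples(samples, y):
    """Sample-based estimate of the CRPS, a proper scoring rule:
    mean |X - y| minus half the mean pairwise distance between samples."""
    samples = np.asarray(samples, dtype=float)
    term1 = np.mean(np.abs(samples - y))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - term2
```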
arXiv Detail & Related papers (2024-06-17T10:33:00Z) - Improved off-policy training of diffusion samplers [93.66433483772055]
We study the problem of training diffusion models to sample from a distribution with an unnormalized density or energy function.
We benchmark several diffusion-structured inference methods, including simulation-based variational approaches and off-policy methods.
Our results shed light on the relative advantages of existing algorithms while bringing into question some claims from past work.
arXiv Detail & Related papers (2024-02-07T18:51:49Z) - StyleGenes: Discrete and Efficient Latent Distributions for GANs [149.0290830305808]
We propose a discrete latent distribution for Generative Adversarial Networks (GANs).
Instead of drawing latent vectors from a continuous prior, we sample from a finite set of learnable latents.
We take inspiration from the encoding of information in biological organisms.
arXiv Detail & Related papers (2023-04-30T23:28:46Z) - Unsupervised Sampling Promoting for Stochastic Human Trajectory
Prediction [10.717921532244613]
We propose a novel method, called BOsampler, to adaptively mine potential paths with Bayesian optimization in an unsupervised manner.
Specifically, we model the trajectory sampling as a Gaussian process and construct an acquisition function to measure the potential sampling value.
This acquisition function applies the original distribution as prior and encourages exploring paths in the long-tail region.
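The acquisition step can be illustrated with a generic GP-UCB rule: score each candidate by its predicted mean plus an uncertainty bonus, so that under-explored (long-tail) candidates are preferred. This is a textbook acquisition function, not BOsampler's exact one, which additionally applies the original distribution as a prior.

```python
import numpy as np

def ucb_acquisition(mu, sigma, kappa=2.0):
    """Generic upper-confidence-bound acquisition: favor candidates whose
    predicted value is high (mu) or uncertain (sigma)."""
    return np.asarray(mu, dtype=float) + kappa * np.asarray(sigma, dtype=float)

def select_next(candidates, mu, sigma, kappa=2.0):
    """Pick the candidate with the highest acquisition score."""
    scores = ucb_acquisition(mu, sigma, kappa)
    return candidates[int(np.argmax(scores))]
```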
arXiv Detail & Related papers (2023-04-09T19:15:14Z) - Non-Probability Sampling Network for Stochastic Human Trajectory
Prediction [16.676008193894223]
Capturing the multimodal nature of pedestrian motion is essential for trajectory prediction.
We introduce the Quasi-Monte Carlo method, which ensures uniform coverage of the sampling space, as an alternative to conventional random sampling.
We take an additional step by incorporating a learnable sampling network into existing networks for trajectory prediction.
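Quasi-Monte Carlo sampling replaces i.i.d. uniform draws with a low-discrepancy sequence that covers the unit cube evenly. A Halton sequence is one standard construction; the paper may use a different sequence, so this sketch is purely illustrative.

```python
def halton(n, base):
    """First n points of the 1-D Halton low-discrepancy sequence in `base`
    (digit-reversal of the integers 1..n in that base)."""
    out = []
    for i in range(1, n + 1):
        f, x = 1.0, 0.0
        while i > 0:
            f /= base
            x += f * (i % base)
            i //= base
        out.append(x)
    return out

def halton_2d(n):
    """2-D quasi-random points from Halton sequences with coprime bases 2 and 3."""
    return list(zip(halton(n, 2), halton(n, 3)))
```

Unlike random draws, consecutive Halton points fill gaps left by earlier points, which is the "uniform coverage" property the summary refers to.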
arXiv Detail & Related papers (2022-03-25T06:41:47Z) - Saliency Grafting: Innocuous Attribution-Guided Mixup with Calibrated
Label Mixing [104.630875328668]
The Mixup scheme suggests mixing a pair of samples to create an augmented training sample.
We present a novel, yet simple Mixup-variant that captures the best of both worlds.
arXiv Detail & Related papers (2021-12-16T11:27:48Z) - Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
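The generic two-step recipe (fit a density, then sample from it) can be shown in its simplest form with inverse-transform sampling on a 1-D grid. This sketch omits the PSD-model machinery entirely and only illustrates the second step.

```python
import numpy as np

def sample_from_density(xs, density, n, rng=None):
    """Draw n samples from a density evaluated on grid `xs` by normalizing
    to a discrete pmf, building its CDF, and inverting it with uniforms."""
    if rng is None:
        rng = np.random.default_rng()
    p = np.asarray(density, dtype=float)
    p = p / p.sum()                 # normalize to a discrete pmf
    cdf = np.cumsum(p)
    u = rng.uniform(1e-12, 1.0, size=n)  # avoid u == 0 edge case
    idx = np.searchsorted(cdf, u)   # inverse-CDF lookup
    return np.asarray(xs)[idx]
```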
arXiv Detail & Related papers (2021-10-20T12:25:22Z) - The Bures Metric for Generative Adversarial Networks [10.69910379275607]
Generative Adversarial Networks (GANs) are performant generative methods yielding high-quality samples.
We propose to match the real batch diversity to the fake batch diversity.
We observe that diversity matching reduces mode collapse substantially and has a positive effect on the sample quality.
arXiv Detail & Related papers (2020-06-16T12:04:41Z) - DLow: Diversifying Latent Flows for Diverse Human Motion Prediction [32.22704734791378]
We propose a novel sampling method, Diversifying Latent Flows (DLow), to produce a diverse set of samples from a pretrained deep generative model.
During training, DLow uses a diversity-promoting prior over samples as an objective to optimize the latent mappings to improve sample diversity.
Our experiments demonstrate that DLow outperforms state-of-the-art baseline methods in terms of sample diversity and accuracy.
arXiv Detail & Related papers (2020-03-18T17:58:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.