Traversing Time with Multi-Resolution Gaussian Process State-Space Models
- URL: http://arxiv.org/abs/2112.03230v1
- Date: Mon, 6 Dec 2021 18:39:27 GMT
- Title: Traversing Time with Multi-Resolution Gaussian Process State-Space Models
- Authors: Krista Longi, Jakob Lindinger, Olaf Duennbier, Melih Kandemir, Arto Klami, Barbara Rakitsch
- Abstract summary: We propose a novel Gaussian process state-space architecture composed of multiple components, each trained on a different resolution, to model effects on different timescales.
We benchmark our novel method on semi-synthetic data and on an engine modeling task.
In both experiments, our approach compares favorably against its state-of-the-art alternatives that operate on a single time-scale only.
- Score: 17.42262122708566
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian Process state-space models capture complex temporal dependencies in a principled manner by placing a Gaussian Process prior on the transition function. These models have a natural interpretation as discretized stochastic differential equations, but inference for long sequences with fast and slow transitions is difficult. Fast transitions need tight discretizations, whereas slow transitions require backpropagating the gradients over long subtrajectories. We propose a novel Gaussian process state-space architecture composed of multiple components, each trained on a different resolution, to model effects on different timescales. The combined model allows traversing time on adaptive scales, providing efficient inference for arbitrarily long sequences with complex dynamics. We benchmark our novel method on semi-synthetic data and on an engine modeling task. In both experiments, our approach compares favorably against its state-of-the-art alternatives that operate on a single time-scale only.
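The paper does not include code, but a toy sketch can make the multi-resolution idea concrete. The NumPy snippet below draws two random functions from GP priors as stand-ins for learned fast and slow transition components, then rolls out an Euler-Maruyama-style discretization in which the slow drift is re-evaluated only once per coarse block of steps. All names, the additive fast/slow composition, and the coarse-stride update rule are illustrative assumptions, not the authors' architecture.

import numpy as np

rng = np.random.default_rng(0)

def sample_gp_function(lengthscale, variance, grid=np.linspace(-3, 3, 200)):
    """Draw one function from a squared-exponential GP prior on a 1-D grid
    and return a linear-interpolation surrogate for it (a stand-in for a
    learned GP transition component)."""
    dists = grid[:, None] - grid[None, :]
    K = variance * np.exp(-0.5 * (dists / lengthscale) ** 2)
    f = rng.multivariate_normal(np.zeros(len(grid)), K + 1e-8 * np.eye(len(grid)))
    return lambda x: np.interp(x, grid, f)

# Hypothetical components: a fast transition evaluated at every fine step and
# a slow transition re-evaluated only once per coarse block of `stride` steps.
f_fast = sample_gp_function(lengthscale=0.3, variance=0.05)
f_slow = sample_gp_function(lengthscale=2.0, variance=0.5)

def rollout(x0, n_steps, dt=0.01, stride=20, noise_std=0.1):
    """Euler-Maruyama-style rollout of the discretized SDE:
    x[t+1] = x[t] + (fast drift + slow drift) * dt + sqrt(dt) * noise."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    slow_drift = f_slow(x0)
    for t in range(n_steps):
        if t % stride == 0:  # coarse-resolution update of the slow component
            slow_drift = f_slow(x[t])
        drift = f_fast(x[t]) + slow_drift
        x[t + 1] = x[t] + drift * dt + np.sqrt(dt) * noise_std * rng.standard_normal()
    return x

print(rollout(x0=0.0, n_steps=1000)[:5])

Under this simplification, the slow component is only evaluated n_steps / stride times per rollout, which mirrors the abstract's motivation: slow transitions would otherwise require backpropagating gradients over long subtrajectories at the fine discretization.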
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - TMPQ-DM: Joint Timestep Reduction and Quantization Precision Selection for Efficient Diffusion Models [40.5153344875351]
We introduce TMPQ-DM, which jointly optimizes timestep reduction and quantization to achieve a superior performance-efficiency trade-off.
For timestep reduction, we devise a non-uniform grouping scheme tailored to the non-uniform nature of the denoising process.
In terms of quantization, we adopt a fine-grained layer-wise approach to allocate varying bit-widths to different layers based on their respective contributions to the final generative performance.
arXiv Detail & Related papers (2024-04-15T07:51:40Z) - Neural Dynamical Operator: Continuous Spatial-Temporal Model with Gradient-Based and Derivative-Free Optimization Methods [0.0]
We present a data-driven modeling framework called neural dynamical operator that is continuous in both space and time.
A key feature of the neural dynamical operator is the resolution-invariance with respect to both spatial and temporal discretizations.
We show that the proposed model can better predict long-term statistics via the hybrid optimization scheme.
arXiv Detail & Related papers (2023-11-20T14:31:18Z) - Convolutional State Space Models for Long-Range Spatiotemporal Modeling [65.0993000439043]
ConvS5 is an efficient variant for long-range spatiotemporal modeling.
It significantly outperforms Transformers and ConvLSTM on a long horizon Moving-MNIST experiment while training 3X faster than ConvLSTM and generating samples 400X faster than Transformers.
arXiv Detail & Related papers (2023-10-30T16:11:06Z) - Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z) - Applying Regularized Schrödinger-Bridge-Based Stochastic Process in Generative Modeling [0.0]
This study aims to reduce the required number of timesteps and training time, and proposes regularization terms that keep the bidirectional processes consistent under a reduced number of timesteps.
Applying this regularization to various tasks confirms that generative modeling based on a stochastic process with faster sampling speed is possible.
arXiv Detail & Related papers (2022-08-15T11:52:33Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Quantum dynamics simulations beyond the coherence time on NISQ hardware by variational Trotter compression [0.0]
We demonstrate a post-quench dynamics simulation of a Heisenberg model on present-day IBM quantum hardware.
We show how to measure the required cost function, the overlap between the time-evolved and variational states, on present-day hardware.
In addition to carrying out simulations on real hardware, we investigate the performance and scaling behavior of the algorithm with noiseless and noisy classical simulations.
arXiv Detail & Related papers (2021-12-23T15:44:47Z) - Spatio-Temporal Variational Gaussian Processes [26.60276485130467]
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filtering with natural gradient variational inference.
We derive a sparse approximation that constructs a state-space model over a reduced set of inducing points.
We show that for separable Markov kernels the full and sparse cases recover exactly the standard variational GP.
arXiv Detail & Related papers (2021-11-02T16:53:31Z) - Fast and differentiable simulation of driven quantum systems [58.720142291102135]
We introduce a semi-analytic method based on the Dyson expansion that allows us to time-evolve driven quantum systems much faster than standard numerical methods.
We show results of the optimization of a two-qubit gate using transmon qubits in the circuit QED architecture.
arXiv Detail & Related papers (2020-12-16T21:43:38Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)