Time Series Source Separation with Slow Flows
- URL: http://arxiv.org/abs/2007.10182v1
- Date: Mon, 20 Jul 2020 15:15:27 GMT
- Title: Time Series Source Separation with Slow Flows
- Authors: Edouard Pineau, Sébastien Razakarivony, Thomas Bonald
- Abstract summary: We show that slow feature analysis (SFA) naturally fits into the flow-based models (FBM) framework, a type of invertible neural latent variable model.
Building upon recent advances in blind source separation, we show that this fit makes the time series decomposition identifiable.
- Score: 5.953590600890215
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we show that slow feature analysis (SFA), a common time series
decomposition method, naturally fits into the flow-based models (FBM)
framework, a type of invertible neural latent variable model. Building upon
recent advances in blind source separation, we show that this fit makes the
time series decomposition identifiable.
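The classical, linear form of SFA that the abstract builds on can be sketched in a few lines of NumPy: whiten the series, then take the directions along which the whitened signal changes most slowly (smallest eigenvalues of the covariance of temporal differences). This is an illustrative sketch of plain linear SFA only, not the paper's flow-based (invertible neural network) version; the function name `linear_sfa` and the toy mixing setup are our own.

```python
import numpy as np

def linear_sfa(x, n_components=2):
    """Classical linear slow feature analysis.

    x: array of shape (T, d), a multivariate time series.
    Returns the n_components slowest features, shape (T, n_components).
    """
    # Center and whiten the data so all directions have unit variance.
    x = x - x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    whitener = eigvec / np.sqrt(eigval)  # scale each eigenvector column
    z = x @ whitener

    # Covariance of temporal differences: slow directions have small variance here.
    dz = np.diff(z, axis=0)
    dcov = np.cov(dz, rowvar=False)
    dval, dvec = np.linalg.eigh(dcov)  # eigenvalues in ascending order

    # Project onto the slowest directions (smallest difference-variance).
    return z @ dvec[:, :n_components]
```

On a linear mixture of a slow and a fast sinusoid, the first slow feature recovers the slow source up to sign, which is the identifiability question the paper studies in the nonlinear, flow-based setting.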
Related papers
- SigDiffusions: Score-Based Diffusion Models for Long Time Series via Log-Signature Embeddings [3.801509221714223]
We introduce SigDiffusion, a novel diffusion model operating on log-signatures of the data.
To recover a signal from its log-signature, we provide new closed-form inversion formulae.
We show that combining SigDiffusion with these formulae results in highly realistic time series generation.
arXiv Detail & Related papers (2024-06-14T18:04:06Z) - Adversarial Schrödinger Bridge Matching [66.39774923893103]
The Iterative Markovian Fitting (IMF) procedure alternates between Markovian and reciprocal projections of continuous-time processes.
We propose a novel Discrete-time IMF (D-IMF) procedure in which learning of processes is replaced by learning just a few transition probabilities in discrete time.
We show that our D-IMF procedure can provide the same quality of unpaired domain translation as the IMF, using only several generation steps instead of hundreds.
arXiv Detail & Related papers (2024-05-23T11:29:33Z) - A Study of Posterior Stability for Time-Series Latent Diffusion [65.95306174480034]
We first explain that posterior collapse reduces latent diffusion to a VAE, making it less expressive.
We introduce the notion of dependency measures, showing that the latent variable sampled from the diffusion model loses control of the generation process.
We also analyze the causes of posterior collapse and introduce a new framework based on this analysis, which addresses the problem and supports a more expressive prior distribution.
arXiv Detail & Related papers (2024-05-22T21:54:12Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Predictive Modeling in the Reservoir Kernel Motif Space [0.9217021281095907]
This work proposes a time series prediction method based on the kernel view of linear reservoirs.
We provide a geometric interpretation of our approach shedding light on how our approach is related to the core reservoir models.
Empirical experiments then compare predictive performances of our suggested model with those of recent state-of-art transformer based models.
arXiv Detail & Related papers (2024-05-11T16:12:25Z) - PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experiments across seven diverse real-world temporal LMTF datasets reveal that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Markov Chain Monte Carlo for Continuous-Time Switching Dynamical Systems [26.744964200606784]
We propose a novel inference algorithm utilizing a Markov Chain Monte Carlo approach.
The presented Gibbs sampler efficiently obtains samples from the exact continuous-time posterior processes.
arXiv Detail & Related papers (2022-05-18T09:03:00Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Time Adaptive Gaussian Model [0.913755431537592]
Our model is a generalization of state-of-the-art methods for the inference of temporal graphical models.
It performs pattern recognition by clustering data points in time, and it finds probabilistic (and possibly causal) relationships among the observed variables.
arXiv Detail & Related papers (2021-02-02T00:28:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.