Time Series Source Separation with Slow Flows
- URL: http://arxiv.org/abs/2007.10182v1
- Date: Mon, 20 Jul 2020 15:15:27 GMT
- Title: Time Series Source Separation with Slow Flows
- Authors: Edouard Pineau, Sébastien Razakarivony, Thomas Bonald
- Abstract summary: We show that slow feature analysis (SFA) naturally fits into the flow-based models (FBM) framework, a class of invertible neural latent-variable models.
Building upon recent advances in blind source separation, we show that such a fit makes the time series decomposition identifiable.
- Score: 5.953590600890215
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we show that slow feature analysis (SFA), a common time series
decomposition method, naturally fits into the flow-based models (FBM)
framework, a class of invertible neural latent-variable models. Building upon
recent advances in blind source separation, we show that such a fit makes the
time series decomposition identifiable.
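SFA itself is easy to sketch: it finds projections of a multivariate signal whose outputs vary as slowly as possible over time, subject to unit-variance and decorrelation constraints. The following is a minimal linear-SFA sketch in NumPy; it is not the paper's flow-based model, and the toy data, function name, and mixing setup are illustrative assumptions:

```python
import numpy as np

def linear_sfa(x, n_components=1):
    """Linear slow feature analysis: find projections of x whose
    outputs vary as slowly as possible, under unit-variance constraints."""
    x = x - x.mean(axis=0)
    # Whiten the signal so the variance constraint becomes orthonormality.
    cov = np.cov(x, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    W = evecs / np.sqrt(evals)          # columns scaled: whitening matrix
    z = x @ W
    # Slowness = variance of the temporal difference; minimize it.
    dz = np.diff(z, axis=0)
    d_evals, d_evecs = np.linalg.eigh(np.cov(dz, rowvar=False))
    # eigh returns ascending eigenvalues, so the first columns are slowest.
    return z @ d_evecs[:, :n_components]

# Toy demo: a slow and a fast sinusoid, linearly mixed.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 2000)
sources = np.stack([np.sin(t), np.sin(25 * t)], axis=1)
mixed = sources @ rng.normal(size=(2, 2))
slow = linear_sfa(mixed, n_components=1).ravel()
# The slowest extracted feature should correlate strongly with sin(t),
# up to sign, even though the observations are a mixture.
corr = abs(np.corrcoef(slow, np.sin(t))[0, 1])
print(round(corr, 2))
```

In the paper's framing, the linear projection above is replaced by an invertible neural network (a flow), and the slowness objective is imposed on the flow's latent variables, which is what makes the decomposition identifiable.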
Related papers
- MoTM: Towards a Foundation Model for Time Series Imputation based on Continuous Modeling [0.0]
We take a first step toward filling this gap by leveraging implicit neural representations (INRs).
MoTM combines a basis of INRs, each trained independently on a distinct family of time series, with a ridge regressor that adapts to the observed context at inference.
We demonstrate robust in-domain and out-of-domain generalization across diverse imputation scenarios.
arXiv Detail & Related papers (2025-07-17T15:16:30Z) - Solving Inverse Problems with FLAIR [59.02385492199431]
Flow-based latent generative models are able to generate images of remarkable quality, even enabling text-to-image generation.
We present FLAIR, a novel training-free variational framework that leverages flow-based generative models as a prior for inverse problems.
Results on standard imaging benchmarks demonstrate that FLAIR consistently outperforms existing diffusion- and flow-based methods in reconstruction quality and sample diversity.
arXiv Detail & Related papers (2025-06-03T09:29:47Z) - Trajectory Generator Matching for Time Series [0.0]
We find new generators of SDEs and jump processes inspired by trajectory flow matching.
We can handle discontinuities of the underlying processes by parameterizing the jump kernel densities.
Unlike most other approaches, we are able to handle irregularly sampled time series.
arXiv Detail & Related papers (2025-05-29T07:56:32Z) - MFRS: A Multi-Frequency Reference Series Approach to Scalable and Accurate Time-Series Forecasting [51.94256702463408]
Time series predictability is derived from periodic characteristics at different frequencies.
We propose a novel time series forecasting method based on multi-frequency reference series correlation analysis.
Experiments on major open and synthetic datasets show state-of-the-art performance.
arXiv Detail & Related papers (2025-03-11T11:40:14Z) - Retrieval-Augmented Diffusion Models for Time Series Forecasting [19.251274915003265]
We propose a Retrieval-Augmented Time series Diffusion model (RATD).
RATD consists of two parts: an embedding-based retrieval process and a reference-guided diffusion model.
Our approach allows leveraging meaningful samples within the database to aid in sampling, thus maximizing the utilization of datasets.
arXiv Detail & Related papers (2024-10-24T13:14:39Z) - SigDiffusions: Score-Based Diffusion Models for Long Time Series via Log-Signature Embeddings [3.801509221714223]
We introduce SigDiffusion, a novel diffusion model operating on log-signatures of the data.
To recover a signal from its log-signature, we provide new closed-form inversion formulae.
We show that combining SigDiffusion with these formulae results in highly realistic time series generation.
arXiv Detail & Related papers (2024-06-14T18:04:06Z) - A Study of Posterior Stability for Time-Series Latent Diffusion [59.41969496514184]
We first show that posterior collapse will reduce latent diffusion to a variational autoencoder (VAE), making it less expressive.
We then introduce a principled method: dependency measure, that quantifies the sensitivity of a recurrent decoder to input variables.
Building on our theoretical and empirical studies, we introduce a new framework that extends latent diffusion and has a stable posterior.
arXiv Detail & Related papers (2024-05-22T21:54:12Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Predictive Modeling in the Reservoir Kernel Motif Space [0.9217021281095907]
This work proposes a time series prediction method based on the kernel view of linear reservoirs.
We provide a geometric interpretation of our approach shedding light on how our approach is related to the core reservoir models.
Empirical experiments then compare the predictive performance of our suggested model with that of recent state-of-the-art transformer-based models.
arXiv Detail & Related papers (2024-05-11T16:12:25Z) - PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the Perspective of Partial Differential Equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model class, "Denoising Diffusion Probabilistic Models" (DDPMs), for chirographic data.
Our model, named "ChiroDiff", is non-autoregressive: it learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z) - Markov Chain Monte Carlo for Continuous-Time Switching Dynamical Systems [26.744964200606784]
We propose a novel inference algorithm utilizing a Markov Chain Monte Carlo approach.
The presented Gibbs sampler allows efficient sampling from the exact continuous-time posterior processes.
arXiv Detail & Related papers (2022-05-18T09:03:00Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Time Adaptive Gaussian Model [0.913755431537592]
Our model is a generalization of state-of-the-art methods for the inference of temporal graphical models.
It performs pattern recognition by clustering data points in time, and it finds probabilistic (and possibly causal) relationships among the observed variables.
arXiv Detail & Related papers (2021-02-02T00:28:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.