FlowTS: Time Series Generation via Rectified Flow
- URL: http://arxiv.org/abs/2411.07506v3
- Date: Sat, 08 Feb 2025 14:19:45 GMT
- Title: FlowTS: Time Series Generation via Rectified Flow
- Authors: Yang Hu, Xiao Wang, Zezhen Ding, Lirong Wu, Huatian Zhang, Stan Z. Li, Sheng Wang, Jiheng Zhang, Ziyun Li, Tianlong Chen
- Abstract summary: FlowTS is an ODE-based model that leverages rectified flow with straight-line transport in probability space.
In the unconditional setting, FlowTS achieves state-of-the-art performance, with context FID scores of 0.019 and 0.011 on the Stock and ETTh datasets.
In the conditional setting, FlowTS achieves superior performance in solar forecasting.
- Score: 67.41208519939626
- Abstract: Diffusion-based models have achieved significant results in time series generation but suffer from inefficient computation: solving high-dimensional ODEs/SDEs via iterative numerical solvers demands hundreds to thousands of drift function evaluations per sample, incurring prohibitive costs. To resolve this, we propose FlowTS, an ODE-based model that leverages rectified flow with straight-line transport in probability space. By learning geodesic paths between distributions, FlowTS achieves computational efficiency through exact linear trajectory simulation, accelerating training and generation while improving performance. We further introduce an adaptive sampling strategy inspired by the exploration-exploitation trade-off, balancing noise adaptation and precision. Notably, FlowTS enables seamless adaptation from unconditional to conditional generation without retraining, ensuring efficient real-world deployment. To further enhance generation authenticity, FlowTS integrates trend and seasonality decomposition, attention registers (for global context aggregation), and Rotary Position Embedding (RoPE) (for positional information). In the unconditional setting, extensive experiments demonstrate that FlowTS achieves state-of-the-art performance, with context FID scores of 0.019 and 0.011 on the Stock and ETTh datasets (prev. best: 0.067, 0.061). In the conditional setting, FlowTS achieves superior performance on solar forecasting (MSE 213, prev. best: 375) and MuJoCo imputation (MSE 7e-5, prev. best: 2.7e-4). The code is available at https://github.com/UNITES-Lab/FlowTS.
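The core mechanism the abstract describes, rectified flow with straight-line transport, can be sketched compactly. The following is a minimal, hypothetical PyTorch illustration of the rectified-flow objective and few-step Euler sampling, not the FlowTS implementation: the names (VelocityNet, rectified_flow_loss, sample) and hyperparameters are assumptions, and FlowTS's actual architecture (trend/seasonality decomposition, attention registers, RoPE) and adaptive sampling schedule are omitted.

```python
# Minimal rectified-flow sketch (illustrative only, not the FlowTS code).
# A velocity network v_theta(x_t, t) is trained so that straight-line
# interpolants x_t = (1 - t) * x0 + t * x1 between noise x0 and data x1
# have the constant target velocity (x1 - x0).
import torch
import torch.nn as nn

class VelocityNet(nn.Module):
    """Toy velocity field; a stand-in for FlowTS's transformer backbone."""
    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # Concatenate state and time as the network input.
        return self.net(torch.cat([x_t, t], dim=-1))

def rectified_flow_loss(model: VelocityNet, x1: torch.Tensor) -> torch.Tensor:
    """One training step: regress the velocity along straight paths."""
    x0 = torch.randn_like(x1)                         # noise endpoint
    t = torch.rand(x1.size(0), 1, device=x1.device)   # uniform time in [0, 1]
    x_t = (1 - t) * x0 + t * x1                       # linear interpolant
    target = x1 - x0                                  # constant drift of the line
    return ((model(x_t, t) - target) ** 2).mean()

@torch.no_grad()
def sample(model: VelocityNet, dim: int, n: int, steps: int = 10) -> torch.Tensor:
    """Euler integration of dx/dt = v_theta(x, t). Near-straight trajectories
    tolerate very few steps, which is the source of the claimed speedup."""
    x = torch.randn(n, dim)
    for i in range(steps):
        t = torch.full((n, 1), i / steps)
        x = x + model(x, t) / steps
    return x
```

A training loop would repeatedly call rectified_flow_loss on batches of (flattened) time-series windows; because the learned trajectories are near-straight, sample() can use on the order of ten Euler steps rather than the hundreds to thousands of drift evaluations the abstract attributes to diffusion baselines.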
Related papers
- FlowDAS: A Flow-Based Framework for Data Assimilation [15.64941169350615]
FlowDAS is a novel generative framework that uses stochastic interpolants to unify the learning of state transition dynamics and generative priors.
Our experiments demonstrate FlowDAS's superior performance on various benchmarks, from the Lorenz system to high-dimensional fluid superresolution tasks.
arXiv Detail & Related papers (2025-01-13T05:03:41Z) - Comparison of Generative Learning Methods for Turbulence Modeling [1.2499537119440245]
High resolution techniques such as Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) are generally not computationally affordable.
Recent advances in machine learning, specifically in generative probabilistic models, offer promising alternatives for turbulence modeling.
This paper investigates the application of three generative models: Variational Autoencoders (VAE), Deep Convolutional Generative Adversarial Networks (DCGAN), and Denoising Diffusion Probabilistic Models (DDPM).
arXiv Detail & Related papers (2024-11-25T14:20:53Z) - SUDS: A Strategy for Unsupervised Drift Sampling [0.5437605013181142]
Supervised machine learning encounters concept drift, where the data distribution changes over time, degrading performance.
We present the Strategy for Unsupervised Drift Sampling (SUDS), a novel method that selects homogeneous samples for retraining using existing drift detection algorithms.
Our results demonstrate the efficacy of SUDS in optimizing labeled data use in dynamic environments.
arXiv Detail & Related papers (2024-11-05T10:55:29Z) - Optimizing Diffusion Models for Joint Trajectory Prediction and Controllable Generation [49.49868273653921]
Diffusion models are promising for joint trajectory prediction and controllable generation in autonomous driving.
We introduce Optimal Gaussian Diffusion (OGD) and Estimated Clean Manifold (ECM) Guidance.
Our methodology streamlines the generative process, enabling practical applications with reduced computational overhead.
arXiv Detail & Related papers (2024-08-01T17:59:59Z) - Boundary-aware Decoupled Flow Networks for Realistic Extreme Rescaling [49.215957313126324]
Recently developed generative methods, including invertible rescaling network (IRN) based and generative adversarial network (GAN) based methods, have demonstrated exceptional performance in image rescaling.
However, IRN-based methods tend to produce over-smoothed results, while GAN-based methods easily generate fake details.
We propose Boundary-aware Decoupled Flow Networks (BDFlow) to generate realistic and visually pleasing results.
arXiv Detail & Related papers (2024-05-05T14:05:33Z) - Language Rectified Flow: Advancing Diffusion Language Generation with Probabilistic Flows [53.31856123113228]
This paper proposes Language Rectified Flow.
The method is based on a reformulation of standard probabilistic flow models.
Experiments and ablation studies demonstrate that our method can be general, effective, and beneficial for many NLP tasks.
arXiv Detail & Related papers (2024-03-25T17:58:22Z) - AdaFlow: Imitation Learning with Variance-Adaptive Flow-Based Policies [21.024480978703288]
We propose AdaFlow, an imitation learning framework based on flow-based generative modeling.
AdaFlow represents the policy with state-conditioned ordinary differential equations (ODEs).
We show that AdaFlow achieves high performance with fast inference speed.
arXiv Detail & Related papers (2024-02-06T10:15:38Z) - Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improve sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, this is the first application of flow models to plan generation in the offline reinforcement learning setting, with a substantial speedup compared to diffusion models (see the guided-velocity sketch after this list).
arXiv Detail & Related papers (2023-11-22T15:07:59Z) - Diffusion Generative Flow Samplers: Improving learning signals through partial trajectory optimization [87.21285093582446]
Diffusion Generative Flow Samplers (DGFS) is a sampling-based framework where the learning process can be tractably broken down into short partial trajectory segments.
Our method takes inspiration from the theory developed for generative flow networks (GFlowNets).
arXiv Detail & Related papers (2023-10-04T09:39:05Z)
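As referenced in the Guided Flows entry above, a guided velocity field can be formed by blending conditional and unconditional predictions, analogous to classifier-free guidance in diffusion models. The sketch below is a hedged illustration of that idea; the signature model(x, t, cond=...) and the weight w are assumptions, not the paper's API.

```python
# Hedged sketch of classifier-free guidance for a flow model, in the spirit
# of the Guided Flows entry above (names and signatures are assumptions).
import torch

@torch.no_grad()
def guided_velocity(model, x, t, cond, w: float = 2.0):
    """Blend conditional and unconditional velocities:
    v = v_uncond + w * (v_cond - v_uncond).
    w = 1 recovers the purely conditional field; w > 1 strengthens guidance."""
    v_uncond = model(x, t, cond=None)   # assumes training with condition dropout
    v_cond = model(x, t, cond=cond)
    return v_uncond + w * (v_cond - v_uncond)
```

During sampling, guided_velocity would simply replace the raw model call inside the ODE integrator at each step.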
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.