How to Unlock Time Series Editing? Diffusion-Driven Approach with Multi-Grained Control
- URL: http://arxiv.org/abs/2506.05276v1
- Date: Thu, 05 Jun 2025 17:32:00 GMT
- Title: How to Unlock Time Series Editing? Diffusion-Driven Approach with Multi-Grained Control
- Authors: Hao Yu, Chu Xin Cheng, Runlong Yu, Yuyang Ye, Shiwei Tong, Zhaofeng Liu, Defu Lian
- Abstract summary: Time Series Editing (TSE) makes precise modifications while preserving temporal coherence. We introduce the CocktailEdit framework to enable simultaneous, flexible control across different types of constraints.
- Score: 28.81619544175742
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent advances in time series generation have shown promise, yet controlling properties in generated sequences remains challenging. Time Series Editing (TSE) - making precise modifications while preserving temporal coherence - requires both point-level constraints and segment-level controls that current methods struggle to provide. We introduce the CocktailEdit framework to enable simultaneous, flexible control across different types of constraints. This framework combines two key mechanisms: a confidence-weighted anchor control for point-wise constraints and a classifier-based control for managing statistical properties such as sums and averages over segments. Our methods achieve precise local control during the denoising inference stage while maintaining temporal coherence and integrating seamlessly with any conditionally trained diffusion-based time series model. Extensive experiments across diverse datasets and models demonstrate the framework's effectiveness. Our work bridges the gap between pure generative modeling and real-world time series editing needs, offering a flexible solution for human-in-the-loop time series generation and editing. The code and demo are provided for validation.
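The abstract describes two guidance mechanisms applied during denoising: confidence-weighted anchoring for point constraints and classifier-style guidance for segment statistics. The sketch below is an illustrative assumption of how such a guided step could look; the function names, shapes, and the simple mean penalty are not the paper's actual API.

```python
import numpy as np

def guided_denoise_step(x, denoise_fn, anchors, seg, target_mean, scale=0.1):
    """Hedged sketch of one guided denoising step (illustrative, not the
    authors' implementation).

    x          : current noisy series, shape (T,)
    denoise_fn : model mapping a noisy series to an estimate of the clean series
    anchors    : dict {index: (value, confidence)} for point-wise constraints
    seg        : (start, end) segment whose mean should approach target_mean
    """
    x0 = denoise_fn(x)  # model's clean-series estimate

    # Confidence-weighted anchor control: blend each anchored point toward
    # its target value in proportion to its confidence weight in [0, 1].
    for i, (value, conf) in anchors.items():
        x0[i] = (1.0 - conf) * x0[i] + conf * value

    # Classifier-style segment control: gradient step on the penalty
    # 0.5 * (mean(x0[s:e]) - target_mean)^2, which nudges the segment mean
    # toward the target without overwriting individual points.
    s, e = seg
    err = x0[s:e].mean() - target_mean
    x0[s:e] -= scale * err / (e - s)

    return x0
```

With `scale` small, the segment mean drifts toward the target over repeated denoising steps rather than being forced in one shot, which is how classifier-style guidance usually trades constraint satisfaction against sample quality.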
Related papers
- CtrlDiff: Boosting Large Diffusion Language Models with Dynamic Block Prediction and Controllable Generation [7.250878248686215]
Diffusion-based language models have emerged as a compelling alternative due to their powerful parallel generation capabilities and inherent editability. We propose CtrlDiff, a dynamic and controllable semi-autoregressive framework that adaptively determines the size of each generation block based on local semantics.
arXiv Detail & Related papers (2025-05-20T14:52:41Z) - TimeDART: A Diffusion Autoregressive Transformer for Self-Supervised Time Series Representation [47.58016750718323]
We propose TimeDART, a novel self-supervised time series pre-training framework. TimeDART unifies two powerful generative paradigms to learn more transferable representations. We conduct extensive experiments on public datasets for time series forecasting and classification.
arXiv Detail & Related papers (2024-10-08T06:08:33Z) - Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a causal Transformer for unified time series forecasting. Based on large-scale pre-training, Timer-XL achieves state-of-the-art zero-shot performance.
arXiv Detail & Related papers (2024-10-07T07:27:39Z) - Temporal Feature Matters: A Framework for Diffusion Model Quantization [105.3033493564844]
Diffusion models rely on the time-step for multi-round denoising. We introduce a novel quantization framework that includes three strategies. This framework preserves most of the temporal information and ensures high-quality end-to-end generation.
arXiv Detail & Related papers (2024-07-28T17:46:15Z) - TSLANet: Rethinking Transformers for Time Series Representation Learning [19.795353886621715]
Time series data is characterized by its intrinsic long and short-range dependencies.
We introduce a novel Time Series Lightweight Network (TSLANet) as a universal convolutional model for diverse time series tasks.
Our experiments demonstrate that TSLANet outperforms state-of-the-art models in various tasks spanning classification, forecasting, and anomaly detection.
arXiv Detail & Related papers (2024-04-12T13:41:29Z) - ConvTimeNet: A Deep Hierarchical Fully Convolutional Model for Multivariate Time Series Analysis [7.979501926410114]
ConvTimeNet is a hierarchical pure convolutional model designed for time series analysis. It adaptively perceives local patterns of temporally dependent basic units in a data-driven manner. A large kernel mechanism is employed to ensure that convolutional blocks can be deeply stacked.
arXiv Detail & Related papers (2024-03-03T12:05:49Z) - Anticipatory Music Transformer [60.15347393822849]
We introduce anticipation: a method for constructing a controllable generative model of a temporal point process.
We focus on infilling control tasks, whereby the controls are a subset of the events themselves.
We train anticipatory infilling models using the large and diverse Lakh MIDI music dataset.
arXiv Detail & Related papers (2023-06-14T16:27:53Z) - Robust Detection of Lead-Lag Relationships in Lagged Multi-Factor Models [61.10851158749843]
Key insights can be obtained by discovering lead-lag relationships inherent in the data.
We develop a clustering-driven methodology for robust detection of lead-lag relationships in lagged multi-factor models.
arXiv Detail & Related papers (2023-05-11T10:30:35Z) - Continuous-time convolutions model of event sequences [46.3471121117337]
Event sequences are non-uniform and sparse, making traditional models unsuitable.
We propose COTIC, a method based on an efficient convolution neural network designed to handle the non-uniform occurrence of events over time.
COTIC outperforms existing models in predicting the next event time and type, achieving an average rank of 1.5 compared to 3.714 for the nearest competitor.
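The COTIC summary above centers on applying convolutions to events that arrive at irregular times. A toy illustration of that general idea, using a fixed exponential kernel rather than the learned kernels an actual model would use, might look like this (all names here are hypothetical):

```python
import numpy as np

def continuous_conv(event_times, marks, kernel, query_times):
    """Toy continuous-time convolution over a non-uniform event sequence.

    For each query time t, sums kernel(t - t_i) * mark_i over all past
    events with t_i <= t, so irregular spacing needs no resampling.
    """
    out = np.zeros(len(query_times))
    for j, t in enumerate(query_times):
        for t_i, mark in zip(event_times, marks):
            if t_i <= t:
                out[j] += kernel(t - t_i) * mark
    return out
```

Because the kernel is evaluated at real-valued time differences, the convolution is well defined at any query time, which is the property that makes this family of models suitable for sparse, non-uniform event data.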
arXiv Detail & Related papers (2023-02-13T10:34:51Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Policy Analysis using Synthetic Controls in Continuous-Time [101.35070661471124]
Counterfactual estimation using synthetic controls is one of the most successful recent methodological developments in causal inference.
We propose a continuous-time alternative that models the latent counterfactual path explicitly using the formalism of controlled differential equations.
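The summary above refers to the formalism of controlled differential equations, where a latent state evolves as dz = f(z) dX driven by an observed control path X. A minimal Euler discretization conveys the idea; in the actual model f would be a learned neural vector field, which this scalar toy (all names assumed) does not attempt to show:

```python
import numpy as np

def cde_euler(z0, f, X):
    """Euler scheme for a controlled differential equation dz = f(z) dX.

    z0 : initial latent state (scalar, for simplicity)
    f  : vector field mapping the state z to a scalar coefficient
    X  : 1-D control path sampled at successive times, shape (T,)
    """
    z = float(z0)
    path = [z]
    for k in range(len(X) - 1):
        dX = X[k + 1] - X[k]   # increment of the control path
        z = z + f(z) * dX      # state update driven by the control
        path.append(z)
    return np.array(path)
```

The key property for counterfactual estimation is that the same vector field f can be driven by a different control path X to trace out the path the latent state would have followed under an alternative intervention.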
arXiv Detail & Related papers (2021-02-02T16:07:39Z) - Generation and storage of spin squeezing via learning-assisted optimal control [7.460567829296081]
We consider a collective spin system coupled to a bosonic field, and show that proper constant-value controls in this model can simulate the dynamical behaviors of these two models.
A better performance of squeezing can be obtained when the control is time-varying, which is generated via a reinforcement learning algorithm.
We propose a four-step strategy for the construction of a new type of combined controls, which include both constant-value and time-varying controls, but performed at different time intervals.
arXiv Detail & Related papers (2020-10-26T09:31:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.