DLGAN : Time Series Synthesis Based on Dual-Layer Generative Adversarial Networks
- URL: http://arxiv.org/abs/2508.21340v1
- Date: Fri, 29 Aug 2025 05:58:36 GMT
- Title: DLGAN : Time Series Synthesis Based on Dual-Layer Generative Adversarial Networks
- Authors: Xuan Hou, Shuhan Liu, Zhaohui Peng, Yaohui Chu, Yue Zhang, Yining Wang,
- Abstract summary: We propose a simple but effective generative model, Dual-Layer Generative Adversarial Networks (DLGAN). The model decomposes the time series generation process into two stages: sequence feature extraction and sequence reconstruction. Together, these two stages form a complete time series autoencoder, enabling supervised learning on the original time series so that the reconstruction process can restore the temporal dependencies of the sequence.
- Score: 13.345872524896722
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series synthesis is an effective approach to ensuring the secure circulation of time series data. Existing time series synthesis methods typically perform temporal modeling based on random sequences to generate target sequences, which often struggle to ensure the temporal dependencies in the generated time series. Additionally, directly modeling temporal features on random sequences makes it challenging to accurately capture the feature information of the original time series. To address the above issues, we propose a simple but effective generative model, Dual-Layer Generative Adversarial Networks, named DLGAN. The model decomposes the time series generation process into two stages: sequence feature extraction and sequence reconstruction. First, these two stages form a complete time series autoencoder, enabling supervised learning on the original time series to ensure that the reconstruction process can restore the temporal dependencies of the sequence. Second, a Generative Adversarial Network (GAN) is used to generate synthetic feature vectors that align with the real time series feature vectors, ensuring that the generator can capture the temporal features from real time series. Extensive experiments on four public datasets demonstrate the superiority of this model across various evaluation metrics.
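The two-stage design described in the abstract can be sketched in miniature. The following NumPy toy uses random, untrained weights and hypothetical sizes (T, D, H); it shows only the data flow of the two stages (autoencoder reconstruction, and a GAN operating in feature space), not the paper's actual architecture or training procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
T, D, H = 24, 3, 8  # sequence length, channels, feature dim (hypothetical sizes)

# Stage 1: autoencoder. The encoder extracts per-step feature vectors and the
# decoder reconstructs the series (in the paper, trained with a supervised loss).
W_enc = rng.normal(size=(D, H))
W_dec = rng.normal(size=(H, D))

def encode(x):   # (T, D) -> (T, H) feature vectors
    return np.tanh(x @ W_enc)

def decode(h):   # (T, H) -> (T, D) reconstructed series
    return h @ W_dec

# Stage 2: GAN in feature space. The generator maps noise to synthetic feature
# vectors; the discriminator scores real vs. synthetic features.
W_gen = rng.normal(size=(H, H))
w_disc = rng.normal(size=(H,))

def generate(z):       # (T, H) noise -> (T, H) synthetic features
    return np.tanh(z @ W_gen)

def discriminate(h):   # (T, H) -> (T,) real/fake scores in (0, 1)
    return 1.0 / (1.0 + np.exp(-(h @ w_disc)))

x = rng.normal(size=(T, D))                        # a real multivariate series
x_rec = decode(encode(x))                          # reconstruction path
x_syn = decode(generate(rng.normal(size=(T, H))))  # synthesis path
```

Note that synthesis reuses the trained decoder: the GAN only has to match the distribution of feature vectors, while temporal reconstruction is handled by the supervised autoencoder.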
Related papers
- TimeOmni-VL: Unified Models for Time Series Understanding and Generation [66.55423802406078]
TimeOmni-VL is a vision-centric framework that unifies time series understanding and generation.
TimeOmni-VL is the first to leverage time series understanding as an explicit control signal for high-fidelity generation.
Experiments confirm that this unified approach significantly improves both semantic understanding and numerical precision.
arXiv Detail & Related papers (2026-02-19T07:50:11Z) - TimeMar: Multi-Scale Autoregressive Modeling for Unconditional Time Series Generation [11.455232661227313]
We propose a structure-disentangled multiscale generation framework for time series.
Our approach encodes sequences into discrete tokens at multiple temporal resolutions.
We show that our approach produces higher-quality time series than existing methods.
arXiv Detail & Related papers (2026-01-16T11:00:05Z) - MFRS: A Multi-Frequency Reference Series Approach to Scalable and Accurate Time-Series Forecasting [51.94256702463408]
Time series predictability is derived from periodic characteristics at different frequencies.
We propose a novel time series forecasting method based on multi-frequency reference series correlation analysis.
Experiments on major open and synthetic datasets show state-of-the-art performance.
arXiv Detail & Related papers (2025-03-11T11:40:14Z) - Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a causal Transformer for unified time series forecasting.
Based on large-scale pre-training, Timer-XL achieves state-of-the-art zero-shot performance.
arXiv Detail & Related papers (2024-10-07T07:27:39Z) - PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting [82.03373838627606]
Self-attention mechanism in Transformer architecture requires positional embeddings to encode temporal order in time series prediction.
We argue that this reliance on positional embeddings restricts the Transformer's ability to effectively represent temporal sequences.
We present a model integrating PRE with a standard Transformer encoder, demonstrating state-of-the-art performance on various real-world datasets.
arXiv Detail & Related papers (2024-08-20T01:56:07Z) - TS-Diffusion: Generating Highly Complex Time Series with Diffusion Models [12.646560434352478]
We consider a class of time series with three common problematic properties: sampling irregularities, missingness, and large feature-temporal dimensions.
We introduce a general model, TS-Diffusion, to process such complex time series.
We have conducted extensive experiments on multiple time-series datasets, demonstrating that TS-Diffusion achieves excellent results on both conventional and complex time series.
arXiv Detail & Related papers (2023-11-06T17:52:08Z) - Time-to-Pattern: Information-Theoretic Unsupervised Learning for Scalable Time Series Summarization [7.294418916091012]
We introduce an approach to time series summarization called Time-to-Pattern (T2P).
T2P aims to find a set of diverse patterns that together encode the most salient information, following the notion of minimum description length.
Our synthetic and real-world experiments reveal that T2P discovers informative patterns, even in noisy and complex settings.
arXiv Detail & Related papers (2023-08-26T01:15:32Z) - GT-GAN: General Purpose Time Series Synthesis with Generative Adversarial Networks [11.157586814297138]
We present a general purpose model capable of synthesizing regular and irregular time series data.
We design a generative adversarial network-based method, where many related techniques are carefully integrated into a single framework.
arXiv Detail & Related papers (2022-10-05T06:18:06Z) - HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z) - Towards Generating Real-World Time Series Data [52.51620668470388]
We propose a novel generative framework for time series data generation - RTSGAN.
RTSGAN learns an encoder-decoder module which provides a mapping between a time series instance and a fixed-dimension latent vector.
To generate time series with missing values, we further equip RTSGAN with an observation embedding layer and a decide-and-generate decoder.
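RTSGAN's central idea, mapping a whole series to one fixed-dimension latent vector and back, can be illustrated with a toy sketch. The NumPy code below uses random, untrained weights and hypothetical sizes (T, D, Z); it shows the shape contract of such an encoder-decoder pair, not the paper's actual recurrent architecture or its observation-embedding layer:

```python
import numpy as np

rng = np.random.default_rng(1)
T, D, Z = 24, 3, 16  # series length, channels, latent size (hypothetical)

# Toy stand-ins for learned encoder/decoder weights.
W_enc = rng.normal(size=(D, Z))
W_dec = rng.normal(size=(Z, D))

def encode(x):
    """Map a whole series (T, D) to one fixed-dimension latent vector (Z,)."""
    return np.tanh(x @ W_enc).mean(axis=0)  # mean-pool over time

def decode(z, length):
    """Unroll a latent vector back into a series of the requested length."""
    steps = np.linspace(0.5, 1.0, length)[:, None]  # simple per-step scaling
    return steps * (z @ W_dec)                      # (length, D)

x = rng.normal(size=(T, D))
z = encode(x)           # fixed-size representation of the series
x_syn = decode(z, T)    # series regenerated from the latent
```

Because the latent has a fixed dimension regardless of series length, a GAN can then be trained entirely in latent space, with the decoder turning generated latents into full-length series.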
arXiv Detail & Related papers (2021-11-16T11:31:37Z) - Time Series is a Special Sequence: Forecasting with Sample Convolution and Interaction [9.449017120452675]
A time series is a special type of sequence data: a set of observations collected at evenly spaced intervals of time and ordered chronologically.
Existing deep learning techniques use generic sequence models for time series analysis, which ignore some of its unique properties.
We propose a novel neural network architecture and apply it for the time series forecasting problem, wherein we conduct sample convolution and interaction at multiple resolutions for temporal modeling.
arXiv Detail & Related papers (2021-06-17T08:15:04Z) - Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine the advances in deep point processes models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.