PCF-GAN: generating sequential data via the characteristic function of measures on the path space
- URL: http://arxiv.org/abs/2305.12511v2
- Date: Sat, 6 Apr 2024 18:44:44 GMT
- Title: PCF-GAN: generating sequential data via the characteristic function of measures on the path space
- Authors: Hang Lou, Siran Li, Hao Ni
- Abstract summary: PCF-GAN is a novel GAN that incorporates the path characteristic function (PCF) as the principled representation of time series distribution into the discriminator.
We show that PCF-GAN consistently outperforms state-of-the-art baselines in both generation and reconstruction quality.
- Score: 3.9983665898166425
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generating high-fidelity time series data using generative adversarial networks (GANs) remains a challenging task, as it is difficult to capture the temporal dependence of joint probability distributions induced by time-series data. Towards this goal, a key step is the development of an effective discriminator to distinguish between time series distributions. We propose the so-called PCF-GAN, a novel GAN that incorporates the path characteristic function (PCF) as the principled representation of time series distribution into the discriminator to enhance its generative performance. On the one hand, we establish theoretical foundations of the PCF distance by proving its characteristicity, boundedness, differentiability with respect to generator parameters, and weak continuity, which ensure the stability and feasibility of training the PCF-GAN. On the other hand, we design efficient initialisation and optimisation schemes for PCFs to strengthen the discriminative power and accelerate training efficiency. To further boost the capabilities of complex time series generation, we integrate the auto-encoder structure via sequential embedding into the PCF-GAN, which provides additional reconstruction functionality. Extensive numerical experiments on various datasets demonstrate the consistently superior performance of PCF-GAN over state-of-the-art baselines, in both generation and reconstruction quality. Code is available at https://github.com/DeepIntoStreams/PCF-GAN.
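The abstract compares distributions through a characteristic function rather than moments. The paper's PCF is built from unitary-matrix path developments and is not reproduced here; the following is only a minimal sketch of the underlying idea for plain vector-valued samples: compare two sample sets via their empirical characteristic functions evaluated at random Gaussian frequencies. The function names and the Gaussian frequency choice are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def ecf(samples, freqs):
    # Empirical characteristic function: mean over samples of exp(i <t, x>).
    # samples: (n, d), freqs: (m, d) -> complex array of shape (m,)
    return np.exp(1j * samples @ freqs.T).mean(axis=0)

def cf_distance(x, y, n_freqs=64, seed=0):
    # Root-mean-square gap between the two empirical characteristic
    # functions over random frequencies t ~ N(0, I_d).
    rng = np.random.default_rng(seed)
    freqs = rng.standard_normal((n_freqs, x.shape[1]))
    diff = ecf(x, freqs) - ecf(y, freqs)
    return np.sqrt(np.mean(np.abs(diff) ** 2))
```

The distance is zero for identical samples and grows as the two empirical distributions separate; the PCF replaces the scalar exponential with matrix-valued developments of whole paths, which is what makes the distance characteristic for time-series laws.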
Related papers
- SeriesGAN: Time Series Generation via Adversarial and Autoregressive Learning [0.9374652839580181]
We introduce an advanced framework that integrates the advantages of an autoencoder-generated embedding space with the adversarial training dynamics of GANs.
This method employs two discriminators: one to specifically guide the generator and another to refine both the autoencoder's and generator's output.
Our framework excels at generating high-fidelity time series data, consistently outperforming existing state-of-the-art benchmarks.
arXiv Detail & Related papers (2024-10-28T16:49:03Z) - ChronoGAN: Supervised and Embedded Generative Adversarial Networks for Time Series Generation [0.9374652839580181]
We introduce a robust framework aimed at addressing and mitigating these issues effectively.
This framework integrates the benefits of an Autoencoder-generated embedding space with the adversarial training dynamics of GANs.
We introduce an early generation algorithm and an improved neural network architecture to enhance stability and ensure effective generalization across both short and long time series.
arXiv Detail & Related papers (2024-09-21T04:51:35Z) - PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting [82.03373838627606]
Self-attention mechanism in Transformer architecture requires positional embeddings to encode temporal order in time series prediction.
We argue that this reliance on positional embeddings restricts the Transformer's ability to effectively represent temporal sequences.
We present a model integrating PRE with a standard Transformer encoder, demonstrating state-of-the-art performance on various real-world datasets.
arXiv Detail & Related papers (2024-08-20T01:56:07Z) - High Rank Path Development: an approach of learning the filtration of stochastic processes [6.245824251614165]
We introduce a novel metric called High Rank PCF Distance (HRPCFD) for extended weak convergence.
We then show that HRPCFD admits many favourable analytic properties, which allow us to design an efficient algorithm for training HRPCFD from data and to construct the HRPCF-GAN.
Our numerical experiments on both hypothesis testing and generative modelling validate the out-performance of our approach compared with several state-of-the-art methods.
arXiv Detail & Related papers (2024-05-23T13:20:47Z) - Analysis and Optimization of Wireless Federated Learning with Data Heterogeneity [72.85248553787538]
This paper focuses on performance analysis and optimization for wireless FL, considering data heterogeneity, combined with wireless resource allocation.
We formulate the loss function minimization problem under constraints on long-term energy consumption and latency, and jointly optimize client scheduling, resource allocation, and the number of local training epochs (CRE).
Experiments on real-world datasets demonstrate that the proposed algorithm outperforms other benchmarks in terms of the learning accuracy and energy consumption.
arXiv Detail & Related papers (2023-08-04T04:18:01Z) - CARD: Channel Aligned Robust Blend Transformer for Time Series Forecasting [50.23240107430597]
We design a special Transformer, i.e., Channel Aligned Robust Blend Transformer (CARD for short), that addresses key shortcomings of CI type Transformer in time series forecasting.
First, CARD introduces a channel-aligned attention structure that allows it to capture temporal correlations among signals.
Second, in order to efficiently utilize the multi-scale knowledge, we design a token blend module to generate tokens with different resolutions.
Third, we introduce a robust loss function for time series forecasting to alleviate the potential overfitting issue.
arXiv Detail & Related papers (2023-05-20T05:16:31Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, though their self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - CLMFormer: Mitigating Data Redundancy to Revitalize Transformer-based Long-Term Time Series Forecasting System [46.39662315849883]
Long-term time-series forecasting (LTSF) plays a crucial role in various practical applications.
Existing Transformer-based models, such as Fedformer and Informer, often achieve their best performance on validation sets after just a few epochs.
We propose a novel approach to address this issue by employing curriculum learning and introducing a memory-driven decoder.
arXiv Detail & Related papers (2022-07-16T04:05:15Z) - Principal Component Density Estimation for Scenario Generation Using Normalizing Flows [62.997667081978825]
We propose a dimensionality-reducing flow layer based on the linear principal component analysis (PCA) that sets up the normalizing flow in a lower-dimensional space.
We train the resulting principal component flow (PCF) on data of PV and wind power generation as well as load demand in Germany in the years 2013 to 2015.
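The summary describes a flow layer built on linear PCA that moves the density to a lower-dimensional space. The paper's flow architecture is not reproduced here; the following is only a sketch of the PCA projection-and-reconstruction step such a layer relies on, with all function names being illustrative assumptions.

```python
import numpy as np

def fit_pca(x, k):
    # Center the data and take the top-k right singular vectors of the
    # centered matrix as principal directions.
    mu = x.mean(axis=0)
    _, _, vt = np.linalg.svd(x - mu, full_matrices=False)
    return mu, vt[:k]

def project(x, mu, components):
    # Map data into the k-dimensional principal subspace.
    return (x - mu) @ components.T

def reconstruct(z, mu, components):
    # Map low-dimensional codes back to the original space.
    return z @ components + mu
```

When the data lie in (or near) a k-dimensional affine subspace, as is often the case for correlated power and load profiles, projecting and reconstructing is lossless (or nearly so), which is what justifies running the normalizing flow in the reduced space.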
arXiv Detail & Related papers (2021-04-21T08:42:54Z) - Reciprocal Adversarial Learning via Characteristic Functions [12.961770002117142]
Generative adversarial nets (GANs) have become a preferred tool for tasks involving complicated distributions.
We show how to use the characteristic function (CF) to compare the distributions rather than their moments.
We then prove an equivalence between the embedded and data domains when a reciprocal exists, where we naturally develop the GAN in an auto-encoder structure.
This efficient structure uses only two modules, together with a simple training strategy, to achieve bi-directional generation of clear images.
arXiv Detail & Related papers (2020-06-15T14:04:55Z) - Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the base density to output space mapping is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
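A conditional normalizing flow evaluates p(y|x) by mapping y to a base variable z through an x-dependent invertible transform and applying the change-of-variables formula. The paper's architecture is not reproduced here; as a minimal hedged sketch, the following uses a single conditional affine transform with linear conditioning (all weights and names are illustrative assumptions).

```python
import numpy as np

def log_prob_conditional_affine(y, x, w_mu, w_log_sigma):
    # Single-layer conditional affine flow: y = mu(x) + exp(log_sigma(x)) * z
    # with base density z ~ N(0, 1). Change of variables gives
    #   log p(y|x) = log N(z; 0, 1) - log_sigma(x),
    # summed over output dimensions.
    mu = x @ w_mu
    log_sigma = x @ w_log_sigma
    z = (y - mu) * np.exp(-log_sigma)
    log_base = -0.5 * (z ** 2 + np.log(2 * np.pi))
    return (log_base - log_sigma).sum(axis=-1)
```

With x = 0 the transform is the identity, so the conditional log-density reduces to the standard normal log-density; richer conditioners (e.g. neural networks producing mu and log_sigma) recover the general CNF setting.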
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.