Group GAN
- URL: http://arxiv.org/abs/2205.13741v1
- Date: Fri, 27 May 2022 03:09:55 GMT
- Title: Group GAN
- Authors: Ali Seyfi, Jean-Francois Rajotte, Raymond T. Ng
- Abstract summary: We propose a novel framework that takes time series' common origin into account and favors inter-channel relationship preservation.
We demonstrate empirically that our method helps preserve channel correlations and that our synthetic data performs very well on downstream tasks with medical and financial data.
- Score: 1.1786249372283564
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generating multivariate time series is a promising approach for sharing
sensitive data in many medical, financial, and IoT applications. A common type
of multivariate time series originates from a single source such as the
biometric measurements from a medical patient. This leads to complex dynamical
patterns between individual time series that are hard to learn by typical
generation models such as GANs. There is valuable information in those patterns
that machine learning models can use to better classify, predict or perform
other downstream tasks. We propose a novel framework that takes time series'
common origin into account and favors inter-channel relationship preservation.
The two key points of our method are: 1) the individual time series are
generated from a common point in latent space and 2) a central discriminator
favors the preservation of inter-channel dynamics. We demonstrate empirically
that our method helps preserve channel correlations and that our synthetic data
performs very well on downstream tasks with medical and financial data.
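The two key points above can be illustrated with a minimal numpy sketch. This is not the paper's actual architecture (the dimensions, the linear "generators", and the dot-product "discriminator" are hypothetical stand-ins for neural networks); it only shows the wiring: every channel is generated from the same latent point, and a central discriminator scores all channels jointly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): 3 channels,
# length-16 series, 8-dimensional shared latent space.
N_CHANNELS, SEQ_LEN, LATENT_DIM = 3, 16, 8

def make_generator():
    """One per-channel generator: a tiny linear map from the shared
    latent point to one channel's time series (a stand-in for the
    paper's channel generators, which are neural networks)."""
    W = rng.standard_normal((LATENT_DIM, SEQ_LEN)) * 0.1
    return lambda z: np.tanh(z @ W)

generators = [make_generator() for _ in range(N_CHANNELS)]

def central_discriminator(channels):
    """Stand-in central discriminator: scores the stacked multivariate
    sample, so inter-channel structure is visible to a single critic."""
    x = np.concatenate(channels)           # shape (N_CHANNELS * SEQ_LEN,)
    w = np.ones_like(x) / x.size
    return 1.0 / (1.0 + np.exp(-(x @ w)))  # sigmoid score in (0, 1)

# Key point 1: every channel is generated from the SAME latent point z.
z = rng.standard_normal(LATENT_DIM)
sample = [g(z) for g in generators]

# Key point 2: the central discriminator sees all channels jointly,
# which is what favors preservation of inter-channel dynamics.
score = central_discriminator(sample)
```

Because all channels share the latent point z, any dependence among them can be encoded in the latent space, while the central discriminator penalizes samples whose inter-channel dynamics look unrealistic.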
Related papers
- MedGNN: Towards Multi-resolution Spatiotemporal Graph Learning for Medical Time Series Classification [9.290150386783838]
We propose a Multi-resolution Spatiotemporal Graph Learning framework, MedGNN, for medical time series classification.
We first propose to construct multi-resolution adaptive graph structures to learn dynamic multi-scale embeddings.
We then propose Difference Attention Networks to operate self-attention mechanisms on the finite difference for temporal modeling.
arXiv Detail & Related papers (2025-02-06T21:34:54Z)
- General Time-series Model for Universal Knowledge Representation of Multivariate Time-Series data [61.163542597764796]
We show that time series with different time granularities (or corresponding frequency resolutions) exhibit distinct joint distributions in the frequency domain.
A novel Fourier knowledge attention mechanism is proposed to enable learning time-aware representations from both the temporal and frequency domains.
An autoregressive blank infilling pre-training framework is incorporated into time series analysis for the first time, leading to a generative, task-agnostic pre-training strategy.
arXiv Detail & Related papers (2025-02-05T15:20:04Z)
- Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z)
- Generalized Prompt Tuning: Adapting Frozen Univariate Time Series Foundation Models for Multivariate Healthcare Time Series [3.9599054392856483]
Time series foundation models are pre-trained on large datasets and are able to achieve state-of-the-art performance in diverse tasks.
We propose a prompt-tuning-inspired fine-tuning technique, Gen-P-Tuning, that enables us to adapt an existing univariate time series foundation model.
We demonstrate the effectiveness of our fine-tuning approach against various baselines on two MIMIC classification tasks, and on influenza-like illness forecasting.
arXiv Detail & Related papers (2024-11-19T19:20:58Z)
- Towards Generalisable Time Series Understanding Across Domains [10.350643783811174]
We introduce a novel pre-training paradigm specifically designed to handle time series heterogeneity.
We propose a tokeniser with learnable domain signatures, a dual masking strategy, and a normalised cross-correlation loss.
Our code and pre-trained weights are available at https://www.oetu.com/oetu/otis.
arXiv Detail & Related papers (2024-10-09T17:09:30Z)
- Robust Detection of Lead-Lag Relationships in Lagged Multi-Factor Models [61.10851158749843]
Key insights can be obtained by discovering lead-lag relationships inherent in the data.
We develop a clustering-driven methodology for robust detection of lead-lag relationships in lagged multi-factor models.
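The lead-lag idea can be illustrated with a toy numpy example. This is not the paper's clustering-driven methodology; it only shows the underlying notion that one series "leading" another can be detected by scanning lagged correlations (the series, noise level, and lag search range here are all invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: series b is a noisy copy of series a shifted forward by
# `true_lag` steps, so a "leads" b by 3 steps.
true_lag = 3
a = rng.standard_normal(200)
b = np.roll(a, true_lag) + 0.05 * rng.standard_normal(200)

def best_lag(x, y, max_lag=10):
    """Return the lag (in steps) at which y, shifted back by that lag,
    correlates most strongly with x over the overlapping window."""
    scores = {}
    for lag in range(1, max_lag + 1):
        scores[lag] = np.corrcoef(x[:-lag], y[lag:])[0, 1]
    return max(scores, key=scores.get)

lag = best_lag(a, b)  # recovers true_lag = 3
```

A robust method must additionally cope with noise, spurious correlations, and many series at once, which is where a clustering-driven approach comes in.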
arXiv Detail & Related papers (2023-05-11T10:30:35Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Instance-wise Graph-based Framework for Multivariate Time Series Forecasting [69.38716332931986]
We propose a simple yet efficient instance-wise graph-based framework to utilize the inter-dependencies of different variables at different time stamps.
The key idea of our framework is aggregating information from the historical time series of different variables to the current time series that we need to forecast.
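The aggregation idea described above can be sketched in a few lines of numpy. This is not the paper's graph-based architecture; the similarity measure, softmax weighting, and dimensions are hypothetical stand-ins that only illustrate pooling information from all variables' histories into one variable's forecast.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setup: 4 variables, each with 12 past observations; we want to
# forecast the next value of variable 0 ("the current time series").
history = rng.standard_normal((4, 12))
target = 0

# Similarity of each variable's history to the target's history
# (dot product), turned into attention-style weights with a softmax.
sims = history @ history[target]
weights = np.exp(sims - sims.max())
weights /= weights.sum()

# Aggregate the most recent observation of every variable, weighted by
# how related that variable's history is to the target's.
forecast = float(weights @ history[:, -1])
```

The point is only the information flow: every variable's history contributes to the target's forecast, with weights reflecting inter-variable dependence.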
arXiv Detail & Related papers (2021-09-14T07:38:35Z)
- On Disentanglement in Gaussian Process Variational Autoencoders [3.403279506246879]
We study a recently introduced class of models that have been successful in different tasks on time series data.
Our model exploits the temporal structure of the data by modeling each latent channel with a GP prior and employing a structured variational distribution.
We provide evidence that we can learn meaningful disentangled representations on real-world medical time series data.
arXiv Detail & Related papers (2021-02-10T15:49:27Z)
- Pay Attention to Evolution: Time Series Forecasting with Deep Graph-Evolution Learning [33.79957892029931]
This work presents a novel neural network architecture for time-series forecasting.
We named our method Recurrent Graph Evolution Neural Network (ReGENN).
An extensive set of experiments was conducted comparing ReGENN with dozens of ensemble methods and classical statistical ones.
arXiv Detail & Related papers (2020-08-28T20:10:07Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.