A Survey of Transformer Enabled Time Series Synthesis
- URL: http://arxiv.org/abs/2406.02322v1
- Date: Tue, 4 Jun 2024 13:52:42 GMT
- Authors: Alexander Sommers, Logan Cummins, Sudip Mittal, Shahram Rahimi, Maria Seale, Joseph Jaboure, Thomas Arnold
- Abstract summary: Generative AI has received much attention in the image and language domains.
The present survey identifies this gap at the intersection of the transformer, generative AI, and time series data.
The reviewed works show great variety in approach, and have not yet converged on a conclusive answer to the problems the domain poses.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generative AI has received much attention in the image and language domains, with the transformer neural network continuing to dominate the state of the art. Application of these models to time series generation is less explored, however, and is of great utility to machine learning, privacy preservation, and explainability research. The present survey identifies this gap at the intersection of the transformer, generative AI, and time series data, and reviews works in this sparsely populated subdomain. The reviewed works show great variety in approach, and have not yet converged on a conclusive answer to the problems the domain poses. GANs, diffusion models, state space models, and autoencoders were all encountered alongside or surrounding the transformers which originally motivated the survey. While too open a domain to offer conclusive insights, the works surveyed are quite suggestive, and several recommendations for best practice, and suggestions of valuable future work, are provided.
Related papers
- Online Model-based Anomaly Detection in Multivariate Time Series: Taxonomy, Survey, Research Challenges and Future Directions
Time-series anomaly detection plays an important role in engineering processes.
This survey introduces a novel taxonomy that distinguishes between online and offline detection, and between training and inference.
It presents the most popular data sets and evaluation metrics used in the literature, as well as a detailed analysis.
arXiv Detail & Related papers (2024-08-07T13:01:10Z)
- State-Space Modeling in Long Sequence Processing: A Survey on Recurrence in the Transformer Era
This survey provides an in-depth summary of the latest approaches that are based on recurrent models for sequential data processing.
The emerging picture suggests there is room for novel routes, built on learning algorithms that depart from standard Backpropagation Through Time.
arXiv Detail & Related papers (2024-06-13T12:51:22Z)
- On the Resurgence of Recurrent Models for Long Sequences -- Survey and Research Opportunities in the Transformer Era
This survey is aimed at providing an overview of these trends framed under the unifying umbrella of Recurrence.
It emphasizes novel research opportunities that become prominent when abandoning the idea of processing long sequences.
arXiv Detail & Related papers (2024-02-12T23:55:55Z)
- Transformer for Object Re-Identification: A Survey
Vision Transformers have spurred a growing number of studies delving deeper into Transformer-based Re-ID.
This paper provides a comprehensive review and in-depth analysis of Transformer-based Re-ID.
In view of the trend toward unsupervised Re-ID, we propose a new Transformer baseline, UntransReID, which achieves state-of-the-art performance.
arXiv Detail & Related papers (2024-01-13T03:17:57Z)
- A Comprehensive Survey on Applications of Transformers for Deep Learning Tasks
The Transformer is a deep neural network that employs a self-attention mechanism to capture contextual relationships within sequential data.
Transformer models excel at handling long-range dependencies between input sequence elements and enable parallel processing; a minimal sketch of the self-attention mechanism appears after this list.
Our survey identifies the top five application domains for transformer-based models.
arXiv Detail & Related papers (2023-06-11T23:13:51Z)
- Two Steps Forward and One Behind: Rethinking Time Series Forecasting with Deep Learning
The Transformer is a highly successful deep learning model that has revolutionised the world of artificial neural networks.
We investigate the effectiveness of Transformer-based models applied to the domain of time series forecasting.
We propose a set of alternative models that are better performing and significantly less complex.
arXiv Detail & Related papers (2023-04-10T12:47:42Z)
- Transformers in Time Series: A Survey
We systematically review Transformer schemes for time series modeling by highlighting their strengths as well as limitations.
From the perspective of network structure, we summarize the adaptations and modifications that have been made to Transformers.
From the perspective of applications, we categorize time series Transformers based on common tasks including forecasting, anomaly detection, and classification.
arXiv Detail & Related papers (2022-02-15T01:43:27Z)
- Transformers predicting the future. Applying attention in next-frame and time series forecasting
Recurrent Neural Networks were, until recently, one of the best ways to capture temporal dependencies in sequences.
With the introduction of the Transformer, it has been shown that an architecture relying solely on attention mechanisms, without any RNN, can improve results on various sequence-processing tasks.
arXiv Detail & Related papers (2021-08-18T16:17:29Z)
- Generative adversarial networks in time series: A survey and taxonomy
Studies of generative adversarial networks (GANs) have grown exponentially in the past few years.
GAN applications have diversified across disciplines such as time series and sequence generation.
Time series generation is a relatively new niche for GANs, and work is ongoing to develop high-quality, diverse, and private time series data.
arXiv Detail & Related papers (2021-07-23T09:38:51Z)
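Since several of the summaries above hinge on the self-attention mechanism (notably the entries on Transformer applications and attention-based forecasting), the following is a minimal sketch of scaled dot-product self-attention in NumPy. The function, weight shapes, and toy data are illustrative assumptions for exposition, not code drawn from any of the surveyed papers.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention.

    x: (seq_len, d_model) input sequence; w_q/w_k/w_v: (d_model, d_k)
    projection matrices (hypothetical names, for illustration only).
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v             # queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])         # scaled pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                              # attention-weighted values

# Every position attends to every other position in one matrix product, so the
# whole sequence is processed in parallel -- the property the summaries above
# contrast with step-by-step RNNs.
rng = np.random.default_rng(0)
x = rng.normal(size=(10, 16))                       # toy sequence: 10 steps, 16 features
w_q, w_k, w_v = (rng.normal(size=(16, 16)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)       # -> (10, 16)
```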