StyleTime: Style Transfer for Synthetic Time Series Generation
- URL: http://arxiv.org/abs/2209.11306v1
- Date: Thu, 22 Sep 2022 20:42:19 GMT
- Title: StyleTime: Style Transfer for Synthetic Time Series Generation
- Authors: Yousef El-Laham, Svitlana Vyetrenko
- Abstract summary: We introduce the concept of stylized features for time series, which are directly related to the realism properties of the time series.
We propose a novel stylization algorithm, called StyleTime, that uses explicit feature extraction techniques to combine the underlying content (trend) of one time series with the style (distributional properties) of another.
- Score: 10.457423272041332
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural style transfer is a powerful computer vision technique that can
incorporate the artistic "style" of one image to the "content" of another. The
underlying theory behind the approach relies on the assumption that the style
of an image is represented by the Gram matrix of its features, which is
typically extracted from pre-trained convolutional neural networks (e.g.,
VGG-19). This idea does not straightforwardly extend to time series stylization
since notions of style for two-dimensional images are not analogous to notions
of style for one-dimensional time series. In this work, a novel formulation of
time series style transfer is proposed for the purpose of synthetic data
generation and enhancement. We introduce the concept of stylized features for
time series, which are directly related to the realism properties of the time series,
and propose a novel stylization algorithm, called StyleTime, that uses explicit
feature extraction techniques to combine the underlying content (trend) of one
time series with the style (distributional properties) of another. Further, we
discuss evaluation metrics, and compare our work to existing state-of-the-art
time series generation and augmentation schemes. To validate the effectiveness
of our methods, we use stylized synthetic data as a means for data augmentation
to improve the performance of recurrent neural network models on several
forecasting tasks.
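For reference, the image-domain baseline the abstract alludes to represents style via Gram matrices of pre-trained CNN activations (e.g., VGG-19). Below is a minimal PyTorch sketch of that Gram-matrix style loss; the normalization constant and the choice of layers are illustrative and not taken from this paper:

```python
import torch

def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    """Gram matrix of one layer's activations: (C, H, W) -> (C, C)."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    # Channel-to-channel correlations, normalized by the tensor size
    return (f @ f.t()) / (c * h * w)

def style_loss(generated_feats, style_feats):
    """Sum of squared Gram-matrix differences over the chosen layers."""
    return sum(((gram_matrix(g) - gram_matrix(s)) ** 2).sum()
               for g, s in zip(generated_feats, style_feats))
```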
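StyleTime itself replaces CNN activations with explicit, interpretable time-series features. The sketch below is a hedged illustration of that idea rather than the authors' implementation: the trend extractor (a moving average), the stylized feature set (mean and standard deviation of increments plus lag-1 autocorrelation), and the loss weights are all assumptions made for illustration.

```python
import torch
import torch.nn.functional as F

def trend(x: torch.Tensor, window: int = 10) -> torch.Tensor:
    # Moving-average trend as an illustrative "content" extractor
    kernel = torch.ones(1, 1, window) / window
    return F.conv1d(x.view(1, 1, -1), kernel, padding="same").view(-1)

def stylized_features(x: torch.Tensor) -> torch.Tensor:
    # Explicit distributional features of the increments (illustrative set;
    # the paper's exact feature definitions may differ)
    dx = x[1:] - x[:-1]
    mu, sigma = dx.mean(), dx.std()
    z = (dx - mu) / (sigma + 1e-8)
    acf1 = (z[1:] * z[:-1]).mean()  # lag-1 autocorrelation
    return torch.stack([mu, sigma, acf1])

def stylize(content: torch.Tensor, style: torch.Tensor,
            steps: int = 500, alpha: float = 1.0, beta: float = 10.0) -> torch.Tensor:
    # Optimize a synthetic series to keep the content's trend while
    # matching the style series' distributional features
    x = content.clone().requires_grad_(True)
    target_trend = trend(content).detach()
    target_style = stylized_features(style).detach()
    opt = torch.optim.Adam([x], lr=1e-2)
    for _ in range(steps):
        opt.zero_grad()
        loss = alpha * ((trend(x) - target_trend) ** 2).mean() \
             + beta * ((stylized_features(x) - target_style) ** 2).sum()
        loss.backward()
        opt.step()
    return x.detach()
```

Under these assumptions, `stylize(content, style)` returns a series that tracks the content's trend while inheriting the style series' marginal and autocorrelation behavior, which is the general shape of the content/style decomposition described in the abstract.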
Related papers
- ZePo: Zero-Shot Portrait Stylization with Faster Sampling [61.14140480095604]
This paper presents an inversion-free portrait stylization framework based on diffusion models that accomplishes content and style feature fusion in merely four sampling steps.
We propose a feature merging strategy to amalgamate redundant features in Consistency Features, thereby reducing the computational load of attention control.
arXiv Detail & Related papers (2024-08-10T08:53:41Z)
- HiCAST: Highly Customized Arbitrary Style Transfer with Adapter Enhanced Diffusion Models [84.12784265734238]
The goal of Arbitrary Style Transfer (AST) is injecting the artistic features of a style reference into a given image/video.
We propose HiCAST, which is capable of explicitly customizing the stylization results according to various source of semantic clues.
A novel learning objective is leveraged for video diffusion model training, which significantly improves cross-frame temporal consistency.
arXiv Detail & Related papers (2024-01-11T12:26:23Z)
- Generative AI Model for Artistic Style Transfer Using Convolutional Neural Networks [0.0]
Artistic style transfer involves fusing the content of one image with the artistic style of another to create unique visual compositions.
This paper presents a comprehensive overview of a novel technique for style transfer using Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2023-10-27T16:21:17Z)
- A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning [84.8813842101747]
Unified Contrastive Arbitrary Style Transfer (UCAST) is a novel style representation learning and transfer framework.
We present an adaptive contrastive learning scheme for style transfer by introducing an input-dependent temperature.
Our framework consists of three key components, i.e., a parallel contrastive learning scheme for style representation and style transfer, a domain enhancement module for effective learning of style distribution, and a generative network for style transfer.
arXiv Detail & Related papers (2023-03-09T04:35:00Z)
- Learning Graph Neural Networks for Image Style Transfer [131.73237185888215]
State-of-the-art parametric and non-parametric style transfer approaches are prone to either distorted local style patterns due to global statistics alignment, or unpleasing artifacts resulting from patch mismatching.
In this paper, we study a novel semi-parametric neural style transfer framework that alleviates the deficiency of both parametric and non-parametric stylization.
arXiv Detail & Related papers (2022-07-24T07:41:31Z)
- Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning [84.8813842101747]
Contrastive Arbitrary Style Transfer (CAST) is a new style representation learning and style transfer method via contrastive learning.
Our framework consists of three key components, i.e., a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of style distribution, and a generative network for image style transfer.
arXiv Detail & Related papers (2022-05-19T13:11:24Z)
- STALP: Style Transfer with Auxiliary Limited Pairing [36.23393954839379]
We present an approach to example-based stylization of images that uses a single pair: a source image and its stylized counterpart.
We demonstrate how to train an image translation network that can perform real-time semantically meaningful style transfer to a set of target images.
arXiv Detail & Related papers (2021-10-20T11:38:41Z)
- In the light of feature distributions: moment matching for Neural Style Transfer [27.25600860698314]
Style transfer aims to render the content of a given image in the graphical/artistic style of another image.
We show that most current implementations of that concept have important theoretical and practical limitations.
We propose a novel approach that matches the desired style more precisely, while still being computationally efficient.
arXiv Detail & Related papers (2021-03-12T11:00:44Z)
- Parameter-Free Style Projection for Arbitrary Style Transfer [64.06126075460722]
This paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation.
This paper further presents a real-time feed-forward model to leverage Style Projection for arbitrary image style transfer.
arXiv Detail & Related papers (2020-03-17T13:07:41Z)