Decoupling Local and Global Representations of Time Series
- URL: http://arxiv.org/abs/2202.02262v1
- Date: Fri, 4 Feb 2022 17:46:04 GMT
- Title: Decoupling Local and Global Representations of Time Series
- Authors: Sana Tonekaboni, Chun-Liang Li, Sercan Arik, Anna Goldenberg, Tomas Pfister
- Abstract summary: We propose a novel generative approach for learning representations for the global and local factors of variation in time series.
In experiments, we demonstrate successful recovery of the true local and global variability factors on simulated data.
We believe that the proposed way of defining representations is beneficial for data modelling and yields better insights into the complexity of real-world data.
- Score: 38.73548222141307
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Real-world time series data are often generated from several sources of
variation. Learning representations that capture the factors contributing to
this variability enables a better understanding of the data via its underlying
generative process and improves performance on downstream machine learning
tasks. This paper proposes a novel generative approach for learning
representations for the global and local factors of variation in time series.
The local representation of each sample models non-stationarity over time with
a stochastic process prior, and the global representation of the sample encodes
the time-independent characteristics. To encourage decoupling between the
representations, we introduce counterfactual regularization that minimizes the
mutual information between the two variables. In experiments, we demonstrate
successful recovery of the true local and global variability factors on
simulated data, and show that representations learned using our method yield
superior performance on downstream tasks on real-world datasets. We believe
that the proposed way of defining representations is beneficial for data
modelling and yields better insights into the complexity of real-world data.
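To make the decoupling concrete, below is a minimal PyTorch-style sketch of the architecture the abstract describes: a global encoder producing one time-invariant code per series, a local encoder producing per-timestep stochastic codes (where a stochastic-process prior would enter the loss), and an adversarial proxy for the mutual-information penalty. All module names, shapes, and the choice of MI proxy are our assumptions, not the authors' implementation.

```python
# Minimal sketch of the local/global decoupling described in the abstract.
# Names, dimensions, and the MI proxy are illustrative assumptions.
import torch
import torch.nn as nn


class GlobalEncoder(nn.Module):
    """Summarize a whole series x of shape (B, T, D) into one
    time-invariant code z_g (the global representation)."""

    def __init__(self, d_in: int, d_global: int):
        super().__init__()
        self.rnn = nn.GRU(d_in, d_global, batch_first=True)

    def forward(self, x):
        _, h = self.rnn(x)        # final hidden state: (1, B, d_global)
        return h.squeeze(0)       # (B, d_global)


class LocalEncoder(nn.Module):
    """Produce a per-timestep stochastic code z_l[t]; the non-stationary
    structure lives here. Training would add a KL term pulling the
    posterior over t -> z_l[t] toward a stochastic-process prior
    (e.g. a Gaussian process)."""

    def __init__(self, d_in: int, d_local: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_in, 64), nn.ReLU(), nn.Linear(64, 2 * d_local))

    def forward(self, x):
        mu, logvar = self.net(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        return z, mu, logvar      # each of shape (B, T, d_local)


class Decoder(nn.Module):
    """Reconstruct x[t] from z_l[t] concatenated with z_g broadcast over time."""

    def __init__(self, d_local: int, d_global: int, d_out: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_local + d_global, 64), nn.ReLU(), nn.Linear(64, d_out))

    def forward(self, z_l, z_g):
        z_g_t = z_g.unsqueeze(1).expand(-1, z_l.size(1), -1)
        return self.net(torch.cat([z_l, z_g_t], dim=-1))


def mi_proxy(z_l, z_g, critic: nn.Module):
    """Adversarial stand-in for the mutual-information term: the critic is
    trained to predict z_g from the averaged local code (minimizing this
    error), while the encoders are trained to increase it (e.g. via
    gradient reversal), discouraging information shared across the codes."""
    return ((critic(z_l.mean(dim=1)) - z_g) ** 2).mean()

# Schematic training objective:
#   loss = reconstruction_error
#        + KL(local posterior || stochastic-process prior)
#        - lambda_mi * mi_proxy(z_l, z_g, critic)   # adversarial sign
```

Once the codes are decoupled, swapping z_g between two series while keeping their z_l fixed is the kind of counterfactual manipulation such a model supports.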
Related papers
- Uniting contrastive and generative learning for event sequences models [51.547576949425604]
This study investigates the integration of two self-supervised learning techniques: instance-wise contrastive learning and a generative approach based on restoring masked events in latent space.
Experiments conducted on several public datasets, focusing on sequence classification and next-event type prediction, show that the integrated method achieves superior performance compared to individual approaches.
arXiv Detail & Related papers (2024-08-19T13:47:17Z)
- Extracting Interpretable Local and Global Representations from Attention on Time Series [0.135975510645475]
This paper targets two transformer-attention-based interpretability methods that work with local abstraction and global representation.
We distinguish between local and global contexts, and provide a comprehensive framework for both interpretation options.
arXiv Detail & Related papers (2023-09-16T00:51:49Z)
- Federated Learning of Models Pre-Trained on Different Features with Consensus Graphs [19.130197923214123]
Learning an effective global model on private and decentralized datasets has become an increasingly important challenge in machine learning.
We propose a feature fusion approach that extracts local representations from local models and incorporates them into a global representation that improves prediction performance.
This paper presents solutions to these problems and demonstrates them in real-world applications on time series data such as power grids and traffic networks.
arXiv Detail & Related papers (2023-06-02T02:24:27Z)
- Leveraging sparse and shared feature activations for disentangled representation learning [112.22699167017471]
We propose to leverage knowledge extracted from a diversified set of supervised tasks to learn a common disentangled representation.
We validate our approach on six real-world distribution-shift benchmarks and different data modalities.
arXiv Detail & Related papers (2023-04-17T01:33:24Z)
- Integrating Local Real Data with Global Gradient Prototypes for Classifier Re-Balancing in Federated Long-Tailed Learning [60.41501515192088]
Federated Learning (FL) has become a popular distributed learning paradigm that involves multiple clients training a global model collaboratively.
Data samples usually follow a long-tailed distribution in the real world, and FL on such decentralized, long-tailed data yields a poorly behaved global model.
In this work, we integrate local real data with global gradient prototypes to form locally balanced datasets.
arXiv Detail & Related papers (2023-01-25T03:18:10Z)
- Self-Supervised Time Series Representation Learning via Cross Reconstruction Transformer [11.908755624411707]
Existing approaches mainly leverage the contrastive learning framework, which learns to distinguish similar from dissimilar data pairs.
We propose Cross Reconstruction Transformer (CRT) to solve the aforementioned problems in a unified way.
CRT achieves time series representation learning through a cross-domain dropping-reconstruction task; a hypothetical sketch of this objective appears after this list.
arXiv Detail & Related papers (2022-05-20T02:15:14Z)
- Exploiting Shared Representations for Personalized Federated Learning [54.65133770989836]
We propose a novel federated learning framework and algorithm for learning a shared data representation across clients and unique local heads for each client.
Our algorithm harnesses the distributed computational power across clients to perform many local updates with respect to the low-dimensional local parameters for every update of the representation.
This result is of interest beyond federated learning to a broad class of problems in which we aim to learn a shared low-dimensional representation among data distributions.
arXiv Detail & Related papers (2021-02-14T05:36:25Z)
- On the Transfer of Disentangled Representations in Realistic Settings [44.367245337475445]
We introduce a new high-resolution dataset with 1M simulated images and over 1,800 annotated real-world images.
We propose new architectures in order to scale disentangled representation learning to realistic high-resolution settings.
arXiv Detail & Related papers (2020-10-27T16:15:24Z)
- Think Locally, Act Globally: Federated Learning with Local and Global Representations [92.68484710504666]
Federated learning is a method of training models on private data distributed over multiple devices.
We propose a new federated learning algorithm that jointly learns compact local representations on each device and a global model across all devices.
We also evaluate on the task of personalized mood prediction from real-world mobile data where privacy is key.
arXiv Detail & Related papers (2020-01-06T12:40:21Z)
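The Cross Reconstruction Transformer entry above names a concrete objective, the cross-domain dropping-reconstruction task. The sketch below is one hypothetical reading of that phrase: drop segments in both the time and frequency domains, then train a model to rebuild the raw series from the surviving pieces. All function names, shapes, and the model signature are assumptions, not CRT's actual implementation.

```python
# Hypothetical sketch of a cross-domain dropping-reconstruction objective,
# in the spirit of the CRT summary above. Names and signatures are assumed.
import torch


def drop_segment(x, drop_frac: float = 0.3):
    """Zero out one random contiguous span of steps in x of shape (B, T, D)."""
    _, T, _ = x.shape
    out = x.clone()
    seg = max(1, int(T * drop_frac))
    for b in range(x.shape[0]):
        start = int(torch.randint(0, T - seg + 1, (1,)))
        out[b, start:start + seg] = 0.0
    return out


def cross_domain_views(x):
    """Build corrupted time-domain and frequency-domain views of x (B, T, D)."""
    time_view = drop_segment(x)
    freq = torch.fft.rfft(x, dim=1)                 # complex (B, T//2+1, D)
    freq_view = drop_segment(torch.view_as_real(freq).flatten(-2))
    return time_view, freq_view                     # freq_view: (B, F, 2*D)


def crt_style_loss(model, x):
    """Train `model` (e.g. a transformer consuming both views; hypothetical
    signature) to reconstruct the uncorrupted series."""
    time_view, freq_view = cross_domain_views(x)
    x_hat = model(time_view, freq_view)
    return torch.mean((x_hat - x) ** 2)
```

Reconstructing across both domains would push the learned representation to capture temporal and spectral structure jointly, rather than only whichever one the corruption happens to spare.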