Multi-Task Dynamical Systems
- URL: http://arxiv.org/abs/2210.04023v1
- Date: Sat, 8 Oct 2022 13:37:55 GMT
- Title: Multi-Task Dynamical Systems
- Authors: Alex Bird, Christopher K. I. Williams, Christopher Hawthorne
- Abstract summary: Time series datasets are often composed of a variety of sequences from the same domain, but from different entities.
This paper describes the multi-task dynamical system (MTDS): a general methodology for extending multi-task learning (MTL) to time series models.
We apply the MTDS to motion-capture data of people walking in various styles using a multi-task recurrent neural network (RNN), and to patient drug-response data using a multi-task pharmacodynamic model.
- Score: 5.881614676989161
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series datasets are often composed of a variety of sequences from the
same domain, but from different entities, such as individuals, products, or
organizations. We are interested in how time series models can be specialized
to individual sequences (capturing the specific characteristics) while still
retaining statistical power by sharing commonalities across the sequences. This
paper describes the multi-task dynamical system (MTDS): a general methodology
for extending multi-task learning (MTL) to time series models. Our approach
endows dynamical systems with a set of hierarchical latent variables which can
modulate all model parameters. To our knowledge, this is a novel development of
MTL, and applies to time series both with and without control inputs. We apply
the MTDS to motion-capture data of people walking in various styles using a
multi-task recurrent neural network (RNN), and to patient drug-response data
using a multi-task pharmacodynamic model.
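To make the abstract concrete: the MTDS premise is that a single low-dimensional code per sequence generates the full parameter vector of the dynamical model, so entities differ only through their codes while the code-to-parameter map is shared. Below is a minimal PyTorch sketch under assumed shapes, with an assumed affine parameter map; the paper's hierarchical prior over z and its inference procedure are omitted.

```python
import torch
import torch.nn as nn

class MTDSSketch(nn.Module):
    """Minimal sketch of a multi-task dynamical system: one latent code z
    per sequence modulates every parameter of a simple RNN."""

    def __init__(self, x_dim, h_dim, z_dim):
        super().__init__()
        self.x_dim, self.h_dim = x_dim, h_dim
        # Parameter count of a vanilla RNN cell: W_x, W_h, and bias b.
        n_params = h_dim * x_dim + h_dim * h_dim + h_dim
        # Assumption: an affine map from z to the full parameter vector.
        self.param_map = nn.Linear(z_dim, n_params)
        self.readout = nn.Linear(h_dim, x_dim)

    def forward(self, x_seq, z):
        # x_seq: (T, x_dim), one entity's sequence; z: (z_dim,), its code.
        theta = self.param_map(z)
        i = self.h_dim * self.x_dim
        j = i + self.h_dim * self.h_dim
        W_x = theta[:i].view(self.h_dim, self.x_dim)
        W_h = theta[i:j].view(self.h_dim, self.h_dim)
        b = theta[j:]
        h = torch.zeros(self.h_dim)
        outs = []
        for x_t in x_seq:
            h = torch.tanh(W_x @ x_t + W_h @ h + b)  # z-modulated dynamics
            outs.append(self.readout(h))
        return torch.stack(outs)

# One shared model; per-entity behavior enters only through z.
model = MTDSSketch(x_dim=3, h_dim=16, z_dim=2)
y = model(torch.randn(50, 3), torch.randn(2))  # (50, 3) predictions
```

In the paper's setting, z would be inferred per entity while param_map and readout stay shared, which is what lets the model specialize to individual sequences without giving up pooled statistical strength.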
Related papers
- Generalized Prompt Tuning: Adapting Frozen Univariate Time Series Foundation Models for Multivariate Healthcare Time Series [3.9599054392856483]
Time series foundation models are pre-trained on large datasets and are able to achieve state-of-the-art performance in diverse tasks.
We propose a prompt-tuning-inspired fine-tuning technique, Gen-P-Tuning, that enables us to adapt an existing univariate time series foundation model to multivariate time series.
We demonstrate the effectiveness of our fine-tuning approach against various baselines on two MIMIC classification tasks, and on influenza-like illness forecasting.
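For context on the prompt-tuning idea this technique builds on: trainable prompt vectors are prepended to the input of a frozen model, and only the prompts plus a small task head are updated. The following is a generic sketch with an assumed stand-in encoder, not Gen-P-Tuning itself.

```python
import torch
import torch.nn as nn

class PromptTuned(nn.Module):
    """Sketch of prompt tuning: learn only prompt embeddings (and a head)
    prepended to the input of a frozen sequence encoder."""

    def __init__(self, frozen_encoder, d_model, n_prompts, n_classes):
        super().__init__()
        self.encoder = frozen_encoder
        for p in self.encoder.parameters():
            p.requires_grad = False  # the foundation model stays frozen
        self.prompts = nn.Parameter(torch.randn(n_prompts, d_model) * 0.02)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, seq_len, d_model), already-embedded series tokens.
        p = self.prompts.unsqueeze(0).expand(x.size(0), -1, -1)
        h = self.encoder(torch.cat([p, x], dim=1))
        return self.head(h[:, -1])  # classify from the final token

# Stand-in encoder for illustration only; not a real foundation model.
enc = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
    num_layers=2)
clf = PromptTuned(enc, d_model=64, n_prompts=8, n_classes=2)
logits = clf(torch.randn(16, 32, 64))  # (16, 2)
```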
arXiv Detail & Related papers (2024-11-19T19:20:58Z)
- TimeMixer++: A General Time Series Pattern Machine for Universal Predictive Analysis [17.09401448377127]
Time series analysis plays a critical role in numerous applications, supporting tasks such as forecasting, classification, anomaly detection, and imputation.
In this work, we present the time series pattern machine (TSPM), a model designed to excel in a broad range of time series tasks through powerful representation and pattern extraction capabilities.
arXiv Detail & Related papers (2024-10-21T14:06:53Z)
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose UniTST, a transformer-based model with a unified attention mechanism over flattened patch tokens.
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
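The unified attention mechanism described above can be pictured as flattening (variate, patch) pairs into a single token axis, so one self-attention layer sees inter-series and intra-series dependencies at once. A rough sketch under assumed shapes, not the paper's exact architecture:

```python
import torch
import torch.nn as nn

class UnifiedPatchAttention(nn.Module):
    """Sketch: flatten (variate, patch) into one token axis so a single
    attention layer captures dependencies across series and across time."""

    def __init__(self, patch_len, d_model, n_heads=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)  # assumed patch embedding
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x):
        # x: (batch, n_vars, seq_len)
        b, v, t = x.shape
        n = t // self.patch_len
        patches = x[:, :, : n * self.patch_len].reshape(b, v * n, self.patch_len)
        tok = self.embed(patches)          # (batch, v*n, d_model) unified tokens
        out, _ = self.attn(tok, tok, tok)  # every patch attends to every other
        return out

layer = UnifiedPatchAttention(patch_len=16, d_model=64)
out = layer(torch.randn(8, 7, 96))  # 7 variates x 6 patches -> (8, 42, 64)
```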
arXiv Detail & Related papers (2024-06-07T14:39:28Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- Two-Stage Aggregation with Dynamic Local Attention for Irregular Time Series [14.883195365310705]
We introduce TADA, a Two-stage aggregation process with Dynamic local Attention to harmonize time-wise and feature-wise irregularities in time series.
TADA outperforms state-of-the-art methods on three real-world datasets.
arXiv Detail & Related papers (2023-11-13T20:54:52Z)
- Diffusion Model is an Effective Planner and Data Synthesizer for Multi-Task Reinforcement Learning [101.66860222415512]
Multi-Task Diffusion Model (MTDiff) is a diffusion-based method that incorporates Transformer backbones and prompt learning for generative planning and data synthesis.
For generative planning, we find MTDiff outperforms state-of-the-art algorithms across 50 tasks on Meta-World and 8 maps on Maze2D.
arXiv Detail & Related papers (2023-05-29T05:20:38Z)
- Integrating Multimodal Data for Joint Generative Modeling of Complex Dynamics [6.848555909346641]
We provide an efficient framework to combine various sources of information for optimal reconstruction.
Our framework is fully generative, producing, after training, trajectories with the same geometrical and temporal structure as those of the ground truth system.
arXiv Detail & Related papers (2022-12-15T15:21:28Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Low Resource Multi-Task Sequence Tagging -- Revisiting Dynamic Conditional Random Fields [67.51177964010967]
We compare different models for low resource multi-task sequence tagging that leverage dependencies between label sequences for different tasks.
We find that explicit modeling of inter-dependencies between task predictions outperforms single-task as well as standard multi-task models.
arXiv Detail & Related papers (2020-05-01T07:11:34Z)
- Variational Hyper RNN for Sequence Modeling [69.0659591456772]
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
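The combination named in this entry, per-timestep latent variables feeding a hypernetwork-style RNN, can be caricatured as a latent z_t that rescales the recurrent computation at each step. The sketch below simplifies weight generation to a learned elementwise scale and drops the variational training objective; all names and shapes are assumptions.

```python
import torch
import torch.nn as nn

class HyperRNNSketch(nn.Module):
    """Sketch: a per-step latent z_t, drawn from a state-dependent
    Gaussian, rescales the RNN's pre-activation at every timestep."""

    def __init__(self, x_dim, h_dim, z_dim):
        super().__init__()
        self.W_x = nn.Linear(x_dim, h_dim, bias=False)
        self.W_h = nn.Linear(h_dim, h_dim)
        self.prior = nn.Linear(h_dim, 2 * z_dim)  # state-dependent prior
        self.scale = nn.Linear(z_dim, h_dim)      # z_t -> per-unit scale
        self.h_dim = h_dim

    def forward(self, x_seq):
        # x_seq: (T, x_dim); returns the (T, h_dim) hidden trajectory.
        h = torch.zeros(self.h_dim)
        hs = []
        for x_t in x_seq:
            mu, logvar = self.prior(h).chunk(2)
            z_t = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
            s = torch.sigmoid(self.scale(z_t))    # stochastic modulation
            h = torch.tanh(s * (self.W_x(x_t) + self.W_h(h)))
            hs.append(h)
        return torch.stack(hs)

hs = HyperRNNSketch(x_dim=3, h_dim=16, z_dim=4)(torch.randn(50, 3))  # (50, 16)
```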
arXiv Detail & Related papers (2020-02-24T19:30:32Z)
- A Deep Structural Model for Analyzing Correlated Multivariate Time Series [11.009809732645888]
We present a deep learning structural time series model which can handle correlated multivariate time series input.
The model explicitly learns/extracts the trend, seasonality, and event components.
We compare our model with several state-of-the-art methods through a comprehensive set of experiments on a variety of time series data sets.
arXiv Detail & Related papers (2020-01-02T18:48:29Z)
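The explicit trend/seasonality/event decomposition mentioned in the last entry is straightforward to illustrate: each component gets its own small subnetwork and the prediction is their sum, so each learned term stays separately inspectable. A generic additive sketch (assumed form, not the paper's network):

```python
import torch
import torch.nn as nn

class StructuralSketch(nn.Module):
    """Sketch: explicit additive components, y_t = trend + seasonality
    + event, each produced by its own small subnetwork."""

    def __init__(self, in_dim, season_period):
        super().__init__()
        self.trend = nn.Linear(in_dim, 1)
        self.season = nn.Embedding(season_period, 1)  # one effect per phase
        self.event = nn.Sequential(
            nn.Linear(in_dim, 16), nn.ReLU(), nn.Linear(16, 1))
        self.period = season_period

    def forward(self, feats, t):
        # feats: (T, in_dim) covariates; t: (T,) integer time index.
        return self.trend(feats) + self.season(t % self.period) + self.event(feats)

m = StructuralSketch(in_dim=5, season_period=7)
yhat = m(torch.randn(100, 5), torch.arange(100))  # (100, 1) forecast
```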
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.