CODA: Temporal Domain Generalization via Concept Drift Simulator
- URL: http://arxiv.org/abs/2310.01508v1
- Date: Mon, 2 Oct 2023 18:04:34 GMT
- Title: CODA: Temporal Domain Generalization via Concept Drift Simulator
- Authors: Chia-Yuan Chang, Yu-Neng Chuang, Zhimeng Jiang, Kwei-Herng Lai, Anxiao
Jiang, Na Zou
- Abstract summary: In real-world applications, machine learning models often become obsolete due to shifts in the joint distribution arising from underlying temporal trends.
We propose the COncept Drift simulAtor framework incorporating a predicted feature correlation matrix to simulate future data for model training.
- Score: 34.21255368783787
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In real-world applications, machine learning models often become obsolete due
to shifts in the joint distribution arising from underlying temporal trends, a
phenomenon known as "concept drift". Existing works propose model-specific
strategies to achieve temporal generalization in the near-future domain.
However, the diverse characteristics of real-world datasets necessitate
customized prediction model architectures. To this end, there is an urgent
demand for a model-agnostic temporal domain generalization approach that
maintains generality across diverse data modalities and architectures. In this
work, we aim to address the concept drift problem from a data-centric
perspective, bypassing the need to consider the interaction between data and model.
Developing such a framework presents non-trivial challenges: (i) existing
generative models struggle to generate out-of-distribution future data, and
(ii) precisely capturing the temporal trends of joint distribution along
chronological source domains is computationally infeasible. To tackle the
challenges, we propose the COncept Drift simulAtor (CODA) framework
incorporating a predicted feature correlation matrix to simulate future data
for model training. Specifically, CODA leverages feature correlations to
represent data characteristics at specific time points, thereby circumventing
the daunting computational costs. Experimental results demonstrate that using
CODA-generated data as training input effectively achieves temporal domain
generalization across different model architectures.
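The core idea in the abstract, representing each time point by its feature correlation matrix and extrapolating that trend to simulate future training data, can be illustrated with a toy sketch. This is not CODA's actual algorithm: the per-entry linear extrapolation, the PSD projection, and the Gaussian sampler below are simplifying assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlation_matrix(X):
    """Feature correlation matrix of one chronological source domain."""
    return np.corrcoef(X, rowvar=False)

def extrapolate_correlation(corrs):
    """Fit a least-squares linear trend to each correlation entry across
    domains, extrapolate one step ahead, and project the result back to a
    valid correlation matrix (symmetric, unit diagonal, PSD)."""
    corrs = np.stack(corrs)                       # (T, d, d)
    T = corrs.shape[0]
    t = np.arange(T)
    slope = ((t[:, None, None] - t.mean()) * (corrs - corrs.mean(0))).sum(0) \
        / ((t - t.mean()) ** 2).sum()
    pred = corrs.mean(0) + slope * (T - t.mean())  # prediction at time T
    pred = np.clip((pred + pred.T) / 2, -1.0, 1.0)
    np.fill_diagonal(pred, 1.0)
    # nearest-PSD repair via eigenvalue clipping, then rescale the diagonal
    w, V = np.linalg.eigh(pred)
    pred = V @ np.diag(np.clip(w, 1e-6, None)) @ V.T
    d = np.sqrt(np.diag(pred))
    return pred / np.outer(d, d)

def simulate_future(corrs, n=500):
    """Sample synthetic 'future' data from a zero-mean Gaussian carrying
    the predicted correlation structure."""
    C = extrapolate_correlation(corrs)
    return rng.multivariate_normal(np.zeros(C.shape[0]), C, size=n)

# toy chronological domains whose feature correlation drifts upward
domains = []
for rho in [0.1, 0.3, 0.5]:
    C = np.array([[1.0, rho], [rho, 1.0]])
    domains.append(rng.multivariate_normal([0.0, 0.0], C, size=2000))

future = simulate_future([correlation_matrix(X) for X in domains])
```

A downstream model trained on `future` would then see the extrapolated correlation (about 0.7 here) rather than the last observed one, which is the data-centric generalization the abstract describes.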
Related papers
- Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z)
- Learning Divergence Fields for Shift-Robust Graph Representations [73.11818515795761]
In this work, we propose a geometric diffusion model with learnable divergence fields for the challenging problem with interdependent data.
We derive a new learning objective through causal inference, which can guide the model to learn generalizable patterns of interdependence that are insensitive across domains.
arXiv Detail & Related papers (2024-06-07T14:29:21Z)
- A Survey on Diffusion Models for Time Series and Spatio-Temporal Data [92.1255811066468]
We review the use of diffusion models in time series and spatio-temporal data, categorizing them by model, task type, data modality, and practical application domain.
We categorize diffusion models into unconditioned and conditioned types, and discuss time series and spatio-temporal data separately.
Our survey covers their application extensively in various fields including healthcare, recommendation, climate, energy, audio, and transportation.
arXiv Detail & Related papers (2024-04-29T17:19:40Z)
- COOL: A Conjoint Perspective on Spatio-Temporal Graph Neural Network for Traffic Forecasting [10.392021668859272]
This paper proposes the Conjoint Spatio-Temporal graph neural network (abbreviated as COOL), which models heterogeneous graphs from prior and posterior information to conjointly capture high-order spatio-temporal relationships.
To capture diverse transitional properties to enhance traffic forecasting, we propose a conjoint-attention decoder that models diverse temporal patterns from both multi-rank and multi-scale views.
arXiv Detail & Related papers (2024-03-02T04:30:09Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world temporal LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
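The self-consuming loop this entry analyzes can be observed in a toy kernel density estimation experiment. This is an illustrative sketch, not the paper's theoretical framework; the bandwidth, sample size, and generation count are arbitrary choices. Each generation is trained only on samples from the previous one, and the KDE's smoothing bandwidth compounds into variance inflation.

```python
import numpy as np

rng = np.random.default_rng(1)

def kde_sample(data, n, bandwidth=0.3):
    """Draw n samples from a Gaussian KDE fitted to `data`:
    pick a bootstrap center, then add bandwidth-scaled noise."""
    centers = rng.choice(data, size=n, replace=True)
    return centers + rng.normal(0.0, bandwidth, size=n)

# generation 0 comes from the true distribution N(0, 1);
# every later generation is trained purely on the previous one's samples
data = rng.normal(0.0, 1.0, size=5000)
variances = [data.var()]
for _ in range(10):
    data = kde_sample(data, 5000)
    variances.append(data.var())
```

Because each resampling step adds roughly `bandwidth**2` of extra variance, the learned distribution steadily widens across generations, a simple instance of the error propagation the paper studies.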
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- A Temporally Disentangled Contrastive Diffusion Model for Spatiotemporal Imputation [35.46631415365955]
We introduce a conditional diffusion framework called C$2$TSD, which incorporates disentangled temporal (trend and seasonality) representations as conditional information.
Our experiments on three real-world datasets demonstrate the superior performance of our approach compared to a number of state-of-the-art baselines.
arXiv Detail & Related papers (2024-02-18T11:59:04Z)
- Explainable Parallel RCNN with Novel Feature Representation for Time Series Forecasting [0.0]
Time series forecasting is a fundamental challenge in data science.
We develop a parallel deep learning framework composed of RNN and CNN.
Extensive experiments on three datasets reveal the effectiveness of our method.
arXiv Detail & Related papers (2023-05-08T17:20:13Z)
- Enhancing the Robustness via Adversarial Learning and Joint Spatial-Temporal Embeddings in Traffic Forecasting [11.680589359294972]
We propose TrendGCN to address the challenge of balancing dynamics and robustness.
Our model simultaneously incorporates spatial (node-wise) embeddings and temporal (time-wise) embeddings to account for heterogeneous space-and-time convolutions.
Compared with traditional approaches that handle step-wise predictive errors independently, our approach can produce more realistic and robust forecasts.
arXiv Detail & Related papers (2022-08-05T09:36:55Z)
- Temporal Domain Generalization with Drift-Aware Dynamic Neural Network [12.483886657900525]
We propose a Temporal Domain Generalization with Drift-Aware Dynamic Neural Network (DRAIN) framework.
Specifically, we formulate the problem into a Bayesian framework that jointly models the relation between data and model dynamics.
It captures the temporal drift of model parameters and data distributions and can predict models in the future without the presence of future data.
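DRAIN's Bayesian treatment is considerably more involved, but the underlying idea of tracking parameter drift across chronological domains and predicting a future model without future data can be sketched minimally. The toy linear models and the linear-trend extrapolation below are assumptions for illustration, not DRAIN's method.

```python
import numpy as np

rng = np.random.default_rng(2)

def true_w(t):
    """Ground-truth weights that drift linearly with time."""
    return np.array([1.0 + 0.2 * t, -0.5, 0.3 * t])

# fit one linear model per chronological domain t = 0..4
T, d, n = 5, 3, 400
weights = []
for t in range(T):
    X = rng.normal(size=(n, d))
    y = X @ true_w(t) + rng.normal(0.0, 0.1, size=n)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    weights.append(w)

# extrapolate each weight one step ahead with a least-squares trend,
# yielding a predicted model for the unseen domain t = 5
W = np.stack(weights)                                  # (T, d)
t = np.arange(T)
slope = ((t - t.mean())[:, None] * (W - W.mean(0))).sum(0) \
    / ((t - t.mean()) ** 2).sum()
w_future = W.mean(0) + slope * (T - t.mean())
```

Here the extrapolated `w_future` closely matches `true_w(5)` even though no data from domain 5 was used, which is the "predict models in the future without the presence of future data" behavior the entry describes.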
arXiv Detail & Related papers (2022-05-21T20:01:31Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.