CORAL: Concept Drift Representation Learning for Co-evolving Time-series
- URL: http://arxiv.org/abs/2501.01480v3
- Date: Fri, 31 Jan 2025 18:13:14 GMT
- Title: CORAL: Concept Drift Representation Learning for Co-evolving Time-series
- Authors: Kunpeng Xu, Lifei Chen, Shengrui Wang
- Abstract summary: Concept drift affects the reliability and accuracy of conventional analysis models. This paper presents CORAL, a method that models time series as an evolving ecosystem to learn representations of concept drift.
- Score: 6.4314326272535896
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In the realm of time series analysis, tackling the phenomenon of concept drift poses a significant challenge. Concept drift, characterized by the evolving statistical properties of time series data, affects the reliability and accuracy of conventional analysis models. This is particularly evident in co-evolving scenarios where interactions among variables are crucial. This paper presents CORAL, a simple yet effective method that models time series as an evolving ecosystem to learn representations of concept drift. CORAL employs kernel-induced self-representation learning to generate a representation matrix that encapsulates the inherent dynamics of co-evolving time series. This matrix serves as a key tool for identifying and adapting to concept drift by observing its temporal variations. Furthermore, CORAL effectively identifies prevailing patterns and offers insights into emerging trends through pattern evolution analysis. Our empirical evaluation of CORAL across various datasets demonstrates its effectiveness in handling the complexities of concept drift. This approach introduces a novel perspective in the theoretical domain of co-evolving time series analysis, enhancing adaptability and accuracy in the face of dynamic data environments, and can be easily integrated into most deep learning backbones.
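As a rough illustration of the mechanism the abstract describes, here is a minimal numpy sketch: an RBF kernel and a ridge-regularized closed form stand in for the paper's kernel-induced self-representation, and drift is scored by the Frobenius change of the representation matrix between windows. The kernel, regularizer, window scheme, and scoring rule are assumptions, not the authors' exact formulation.

```python
import numpy as np

def rbf_gram(X, gamma=0.5):
    """RBF Gram matrix between the rows of X (one row per series segment)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def self_representation(X, lam=1e-2, gamma=0.5):
    """Kernel self-representation Z = argmin ||phi(X) - phi(X)Z||^2 + lam||Z||^2,
    with closed form Z = (K + lam*I)^{-1} K; Z encodes how each series
    is expressed in terms of the others."""
    K = rbf_gram(X, gamma)
    return np.linalg.solve(K + lam * np.eye(K.shape[0]), K)

# Toy co-evolving system: six series in two groups; group memberships
# (i.e., the interactions) change at t=100, which is the concept drift.
rng = np.random.default_rng(0)
T, n, w = 200, 6, 40
t = np.arange(T)
sigs = [np.sin(0.2 * t), np.sin(0.07 * t + 1.0)]
before, after = [0, 0, 0, 1, 1, 1], [0, 1, 1, 1, 0, 0]
X = np.empty((n, T))
for i in range(n):
    X[i, :100], X[i, 100:] = sigs[before[i]][:100], sigs[after[i]][100:]
X += 0.05 * rng.standard_normal(X.shape)

# Drift is flagged where the representation matrix jumps between windows.
prev, scores = None, []
for s in range(0, T - w + 1, w // 2):
    Z = self_representation(X[:, s:s + w])
    if prev is not None:
        scores.append((s, np.linalg.norm(Z - prev, "fro")))
    prev = Z
print(max(scores, key=lambda p: p[1]))   # largest jump sits near t=100
```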
Related papers
- In-Context Linear Regression Demystified: Training Dynamics and Mechanistic Interpretability of Multi-Head Softmax Attention [52.159541540613915]
We study how multi-head softmax attention models are trained to perform in-context learning on linear data.
Our results reveal that in-context learning ability emerges from the trained transformer as an aggregated effect of its architecture and the underlying data distribution.
arXiv Detail & Related papers (2025-03-17T02:00:49Z)
- Community-Aware Temporal Walks: Parameter-Free Representation Learning on Continuous-Time Dynamic Graphs [3.833708891059351]
Community-aware Temporal Walks (CTWalks) is a novel framework for representation learning on continuous-time dynamic graphs.
CTWalks integrates a community-based parameter-free temporal walk sampling mechanism, an anonymization strategy enriched with community labels, and an encoding process.
Experiments on benchmark datasets demonstrate that CTWalks outperforms established methods in temporal link prediction tasks.
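A minimal sketch of the core primitive, under assumed data structures (a timestamped edge list and a hypothetical precomputed node-to-community map): temporal walks that only move forward in time and prefer same-community neighbors. CTWalks' actual sampling and anonymization strategy are richer than this.

```python
import random
from collections import defaultdict

# A continuous-time dynamic graph as timestamped edges (u, v, t), plus a
# (hypothetical) precomputed node -> community map.
edges = [(0, 1, 1.0), (1, 2, 2.0), (1, 3, 2.5), (2, 3, 3.0), (3, 4, 4.0)]
community = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B"}

adj = defaultdict(list)                 # node -> [(neighbor, time), ...]
for u, v, t in edges:
    adj[u].append((v, t))
    adj[v].append((u, t))

def temporal_walk(start, t0, length, rng=random):
    """Sample a temporal walk: each step must move forward in time.
    Same-community neighbors are preferred, a crude stand-in for
    CTWalks' community-aware sampling."""
    walk, node, now = [start], start, t0
    for _ in range(length - 1):
        cand = [(v, t) for v, t in adj[node] if t > now]
        if not cand:
            break
        same = [c for c in cand if community[c[0]] == community[node]]
        node, now = rng.choice(same or cand)
        walk.append(node)
    return walk

random.seed(0)
print(temporal_walk(start=0, t0=0.0, length=5))
```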
arXiv Detail & Related papers (2025-01-21T04:16:46Z)
- Conservation-informed Graph Learning for Spatiotemporal Dynamics Prediction [84.26340606752763]
In this paper, we introduce the conservation-informed GNN (CiGNN), an end-to-end explainable learning framework.
The network is designed to conform to a general conservation law via symmetry, with conservative and non-conservative information passed over a multiscale space by a latent temporal marching strategy.
Results demonstrate that CiGNN exhibits remarkable baseline accuracy and generalizability, and is readily applicable to learning for the prediction of various spatiotemporal dynamics.
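One concrete way to "conform to a conservation law via symmetry" is antisymmetric pairwise fluxes, so whatever leaves a node enters its neighbor and the total is invariant. The sketch below illustrates that principle only; it is not CiGNN's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
u = rng.random(n)                        # node quantity (e.g., mass/energy)
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]

def flux(ui, uj):
    """Antisymmetric pairwise flux: flux(i->j) = -flux(j->i), so whatever
    leaves one node enters its neighbor and the total is conserved."""
    return 0.1 * (ui - uj)

# One message-passing step with conservation-respecting fluxes.
du = np.zeros(n)
for i, j in edges:
    f = flux(u[i], u[j])
    du[i] -= f
    du[j] += f
print("total before:", u.sum().round(6), "after:", (u + du).sum().round(6))
```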
arXiv Detail & Related papers (2024-12-30T13:55:59Z)
- WormKAN: Are KAN Effective for Identifying and Tracking Concept Drift in Time Series? [6.4314326272535896]
WormKAN is a concept-aware KAN-based model to address concept drift in co-evolving time series.
WormKAN consists of three key components: Patch Normalization, Temporal Representation Module, and Concept Dynamics.
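Only the component names are given above; as a loose illustration of the first one, a patch normalization could z-score each patch independently so that patch-level statistics, which shift under drift, are factored out. This is an assumed reading, not WormKAN's published implementation.

```python
import numpy as np

def patch_normalize(x, patch_len=16, eps=1e-6):
    """Split a series into patches and z-normalize each patch independently,
    factoring out patch-level statistics that shift under drift."""
    n = (len(x) // patch_len) * patch_len
    patches = x[:n].reshape(-1, patch_len)
    mu = patches.mean(axis=1, keepdims=True)
    sd = patches.std(axis=1, keepdims=True) + eps
    return (patches - mu) / sd

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 64), rng.normal(5, 3, 64)])  # drifted half
out = patch_normalize(x)
print(out.shape, out.mean(axis=1).round(6)[:4])   # each patch ~ zero mean
```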
arXiv Detail & Related papers (2024-10-13T23:05:37Z)
- Deep End-to-End Survival Analysis with Temporal Consistency [49.77103348208835]
We present a novel Survival Analysis algorithm designed to efficiently handle large-scale longitudinal data.
A central idea in our method is temporal consistency, a hypothesis that past and future outcomes in the data evolve smoothly over time.
Our framework uniquely incorporates temporal consistency into large datasets by providing a stable training signal.
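The temporal-consistency hypothesis suggests a simple regularizer: penalize abrupt jumps in a predicted risk trajectory. The sketch below uses squared first differences as an assumed concrete form; the paper's actual training signal may be defined differently.

```python
import numpy as np

def temporal_consistency_penalty(risk_over_time):
    """Penalize abrupt jumps in a predicted risk trajectory: sum of squared
    first differences, encouraging smooth evolution over time."""
    r = np.asarray(risk_over_time, dtype=float)
    return float(np.sum(np.diff(r) ** 2))

def total_loss(task_loss, risk_over_time, lam=0.1):
    """Combined objective sketch: task loss plus the consistency penalty."""
    return task_loss + lam * temporal_consistency_penalty(risk_over_time)

# A smooth trajectory is penalized far less than an erratic one.
smooth = [0.10, 0.12, 0.15, 0.18, 0.20]
jumpy  = [0.10, 0.45, 0.05, 0.50, 0.20]
print(temporal_consistency_penalty(smooth), temporal_consistency_penalty(jumpy))
```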
arXiv Detail & Related papers (2024-10-09T11:37:09Z)
- Wormhole: Concept-Aware Deep Representation Learning for Co-Evolving Sequences [6.4314326272535896]
This paper introduces Wormhole, a novel deep representation learning framework that is concept-aware and designed for co-evolving time sequences.
Concept transitions are detected by identifying abrupt changes in the latent space, signifying a shift to new behavior.
This novel mechanism accurately discerns concepts within co-evolving sequences and pinpoints the exact locations of these wormholes.
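A minimal sketch of the detection idea: summarize each sliding window by a compact code (here a flattened window covariance stands in for a learned deep latent) and flag windows where consecutive codes jump abruptly. Window size, code, and threshold are all assumptions.

```python
import numpy as np

def detect_transitions(X, window=30, thresh=2.0):
    """Summarize each sliding window by a compact code (flattened window
    covariance, standing in for a learned latent) and flag a concept
    transition where consecutive codes jump abruptly."""
    codes = [(s, np.cov(X[s:s + window].T).flatten())
             for s in range(0, len(X) - window + 1, window // 2)]
    dists = [(s2, np.linalg.norm(c2 - c1))
             for (s1, c1), (s2, c2) in zip(codes, codes[1:])]
    vals = np.array([d for _, d in dists])
    return [s for s, d in dists if d > vals.mean() + thresh * vals.std()]

rng = np.random.default_rng(1)
t = np.arange(300)
X = np.column_stack([np.sin(0.2 * t), np.cos(0.2 * t)])
X += 0.05 * rng.standard_normal((300, 2))
X[150:] *= np.array([3.0, 0.3])      # regime change at t=150
print(detect_transitions(X))         # windows near t=150 are flagged
```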
arXiv Detail & Related papers (2024-09-20T19:11:39Z)
- Decoupled Marked Temporal Point Process using Neural Ordinary Differential Equations [14.828081841581296]
A Marked Temporal Point Process (MTPP) is a process whose realization is a set of event-time data.
Recent studies have utilized deep neural networks to capture complex temporal dependencies of events.
We propose a Decoupled MTPP framework that disentangles characterization of a process into a set of evolving influences from different events.
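As an assumed, simplified stand-in for the evolving influences, the sketch below decomposes the event intensity into a base rate plus one decaying influence per past event, with exponential kernels in place of the paper's neural-ODE dynamics.

```python
import numpy as np

# Past events as (time, type); each event contributes its own decaying
# influence curve, a crude exponential stand-in for neural-ODE influences.
events = [(1.0, 0), (2.5, 1), (4.0, 0)]
mu = 0.1                                # base rate (assumed)
alpha = {0: 0.8, 1: 0.5}                # excitation per event type (assumed)
beta = {0: 1.0, 1: 2.0}                 # decay rate per event type (assumed)

def influence(t, t_i, k):
    """Influence of a past type-k event at time t_i, evaluated at t."""
    return alpha[k] * np.exp(-beta[k] * (t - t_i)) if t > t_i else 0.0

def intensity(t):
    """Decoupled intensity: base rate plus a sum of per-event influences."""
    return mu + sum(influence(t, t_i, k) for t_i, k in events)

for t in [1.5, 3.0, 5.0]:
    parts = [round(influence(t, t_i, k), 3) for t_i, k in events]
    print(f"t={t}: intensity={intensity(t):.3f}, per-event parts={parts}")
```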
arXiv Detail & Related papers (2024-06-10T10:15:32Z)
- EulerFormer: Sequential User Behavior Modeling with Complex Vector Attention [88.45459681677369]
We propose a novel transformer variant with complex vector attention, named EulerFormer.
It provides a unified theoretical framework to formulate both semantic difference and positional difference.
It is more robust to semantic variations and possesses superior theoretical properties in principle.
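A minimal sketch of the complex-attention idea: pair real features into complex numbers, rotate them by a position-dependent phase (Euler's formula), and score attention by the real part of the complex inner product, so semantic and positional differences enter through one formulation. The frequencies and scoring details here are assumptions in the spirit of EulerFormer, not its exact equations.

```python
import numpy as np

def to_complex(x):
    """Pair real features into complex numbers: (d,) -> (d/2,) complex."""
    return x[0::2] + 1j * x[1::2]

def rotate(z, pos, base=10000.0):
    """Rotate each complex component by a position-dependent phase, so the
    phase difference between two tokens encodes their relative position."""
    d = z.shape[0]
    freqs = base ** (-np.arange(d) / d)
    return z * np.exp(1j * pos * freqs)

def attention_score(q, k, pos_q, pos_k):
    """Real part of the complex inner product after rotation: it mixes
    semantic difference (feature angles) with positional difference
    (relative phase pos_q - pos_k)."""
    zq, zk = rotate(to_complex(q), pos_q), rotate(to_complex(k), pos_k)
    return np.real(np.vdot(zk, zq))

rng = np.random.default_rng(0)
q, k = rng.standard_normal(8), rng.standard_normal(8)
for dp in [0, 1, 4]:   # same content, growing positional gap
    print(dp, round(attention_score(q, k, pos_q=dp, pos_k=0), 3))
```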
arXiv Detail & Related papers (2024-03-26T14:18:43Z)
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
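A forward simulation of an assumed non-stationary Poisson-gamma dynamical system makes the generative side concrete; the transition matrix here simply interpolates between two fixed matrices over time, and the paper's Gibbs sampler (the inference side) is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
K, V, T = 3, 5, 60                      # latent factors, observed dims, steps
Phi = rng.dirichlet(np.ones(V), size=K).T      # loadings (V x K), columns sum to 1
Pi0 = rng.dirichlet(np.ones(K), size=K).T      # transition matrix at t=0
Pi1 = rng.dirichlet(np.ones(K), size=K).T      # transition matrix at t=T
tau, delta = 5.0, 20.0                  # concentration and scale (assumed)

theta = np.ones(K)
X = np.zeros((T, V), dtype=int)
for t in range(T):
    w = t / (T - 1)
    Pi_t = (1 - w) * Pi0 + w * Pi1      # time-varying transition dynamics
    # theta_t ~ Gamma(tau * Pi_t @ theta_{t-1}, 1/tau); x_t ~ Poisson(delta * Phi @ theta_t)
    theta = rng.gamma(shape=tau * Pi_t @ theta, scale=1.0 / tau)
    X[t] = rng.poisson(delta * Phi @ theta)

print(X[:5])    # simulated count sequences under drifting transitions
```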
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting.
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
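Attraos's chaos-theoretic angle rests on attractor structure; a standard primitive in that toolbox is Takens delay embedding, sketched below on a logistic-map series. This illustrates phase-space reconstruction generally, not Attraos's specific memory mechanism.

```python
import numpy as np

def delay_embed(x, dim=3, lag=5):
    """Takens delay embedding: map a scalar series into a `dim`-dimensional
    phase space so the underlying attractor geometry is recoverable."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])

# Toy chaotic series: the logistic map in its chaotic regime (r = 3.9).
x, xs = 0.4, []
for _ in range(500):
    x = 3.9 * x * (1 - x)
    xs.append(x)

E = delay_embed(np.array(xs), dim=3, lag=1)
print(E.shape)        # (498, 3): points on the reconstructed attractor
```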
arXiv Detail & Related papers (2024-02-18T05:35:01Z)
- Cumulative Distribution Function based General Temporal Point Processes [49.758080415846884]
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model and the development of a methodology for incorporating past event information into future event prediction.
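The CDF-centric view is easy to make concrete: model the next inter-event time through its CDF and recover the density and hazard from it. Below, a Weibull CDF stands in for the paper's learned monotone model; the parameters are assumed.

```python
import numpy as np

# Model the inter-event time via its CDF directly; a Weibull CDF stands in
# for a learned monotone network.
k_shape, lam = 1.5, 2.0                 # assumed parameters

def cdf(t):
    """CDF of the next inter-event time: F(t)."""
    return 1.0 - np.exp(-(t / lam) ** k_shape)

def pdf(t, eps=1e-5):
    """Density by numerical differentiation of the CDF: f = dF/dt."""
    return (cdf(t + eps) - cdf(t - eps)) / (2 * eps)

def hazard(t):
    """Conditional intensity recovered from the CDF: f(t) / (1 - F(t))."""
    return pdf(t) / (1.0 - cdf(t))

for t in [0.5, 1.0, 2.0]:
    print(f"t={t}: F={cdf(t):.3f}, f={pdf(t):.3f}, hazard={hazard(t):.3f}")
```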
arXiv Detail & Related papers (2024-02-01T07:21:30Z)
- Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
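The reframing itself is a one-line transformation: flatten each H x W frame into H*W channels of an ordinary multivariate series, as in the sketch below (shapes assumed for illustration).

```python
import numpy as np

# Urban mobility data as a "video": T frames over an H x W spatial grid.
T, H, W = 24, 8, 8
rng = np.random.default_rng(0)
video = rng.poisson(lam=5.0, size=(T, H, W))

# Super-multivariate view: each grid cell becomes one of H*W channels of a
# plain multivariate time series, so any TS forecaster applies directly.
series = video.reshape(T, H * W)        # shape (24, 64)
print(series.shape)

# Round-trip back to frames when a spatial view is needed.
frames = series.reshape(T, H, W)
assert (frames == video).all()
```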
arXiv Detail & Related papers (2023-12-04T07:39:05Z)
- CODA: Temporal Domain Generalization via Concept Drift Simulator [34.21255368783787]
In real-world applications, machine learning models often become obsolete due to shifts in the joint distribution arising from underlying temporal trends.
We propose the COncept Drift simulAtor framework incorporating a predicted feature correlation matrix to simulate future data for model training.
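A minimal sketch of the simulator idea under strong assumptions: extrapolate a single feature correlation along its past trend, then sample synthetic "future" training data with that correlation. CODA's predicted correlation matrix and generation procedure are more general than this.

```python
import numpy as np

rng = np.random.default_rng(0)

# Feature correlation observed at past time steps: it trends upward.
past_corr = [0.1, 0.2, 0.3, 0.4]                    # corr(x1, x2) per period
t = np.arange(len(past_corr))
pred = float(np.clip(np.polyval(np.polyfit(t, past_corr, 1), len(past_corr)),
                     -0.99, 0.99))                  # predicted future correlation

# Simulate future training data from the predicted correlation structure.
cov = np.array([[1.0, pred], [pred, 1.0]])
future_data = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=1000)
print("predicted corr:", pred,
      "simulated corr:", np.corrcoef(future_data.T)[0, 1].round(3))
```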
arXiv Detail & Related papers (2023-10-02T18:04:34Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure cooperated with the dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
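The dilation mechanism is the easiest piece to make concrete: a causal 1-D convolution whose dilation factor widens the receptive field, so stacked layers with dilations 1, 2, 4 capture scale-specific correlations. The sketch below shows only this primitive, not the full evolutionary graph model.

```python
import numpy as np

def dilated_causal_conv(x, kernel, dilation):
    """1-D causal convolution with dilation: output[i] = sum_j k[j] * x[i - j*d].
    Larger dilations see longer-range (coarser-scale) context with the
    same kernel size."""
    pad = (len(kernel) - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([sum(kernel[j] * xp[i + pad - j * dilation]
                         for j in range(len(kernel)))
                     for i in range(len(x))])

x = np.sin(0.3 * np.arange(32))
kernel = np.array([0.5, 0.3, 0.2])
# Stacking layers with growing dilations yields scale-specific features.
h = x
for d in [1, 2, 4]:
    h = dilated_causal_conv(h, kernel, d)
print(h[:5])
```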
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Temporal Domain Generalization with Drift-Aware Dynamic Neural Network [12.483886657900525]
We propose a Temporal Domain Generalization with Drift-Aware Dynamic Neural Network (DRAIN) framework.
Specifically, we formulate the problem into a Bayesian framework that jointly models the relation between data and model dynamics.
It captures the temporal drift of model parameters and data distributions and can predict models in the future without the presence of future data.
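A crude stand-in for predicting future models: fit per-coordinate linear trends to the parameter vectors of models trained on past time domains and extrapolate one step ahead. DRAIN's Bayesian, drift-aware dynamics are far richer; this only illustrates the "forecast the model, not the data" idea.

```python
import numpy as np

rng = np.random.default_rng(0)

# Suppose we retrained a linear model on each of 5 past time domains and
# kept its parameter vector; drift makes the parameters trend over time.
past_params = np.array([[1.0 + 0.1 * t, -0.5 + 0.05 * t] for t in range(5)])
past_params += 0.01 * rng.standard_normal(past_params.shape)

# Forecast the next domain's parameters by a linear trend per coordinate,
# without ever seeing future data.
t = np.arange(5)
future = np.array([np.polyval(np.polyfit(t, past_params[:, j], deg=1), 5)
                   for j in range(past_params.shape[1])])
print("predicted parameters for the unseen future domain:", future.round(3))
```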
arXiv Detail & Related papers (2022-05-21T20:01:31Z)
- Drift Estimation with Graphical Models [0.0]
We make use of graphical models to elicit the visible structure of the data and infer from it changes in the hidden context.
The paper evaluates the method with real-world data from the Australian electricity market.
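A toy version of the idea: learn a simple dependence graph per window (here, thresholded correlations stand in for a proper graphical model) and read a hidden-context change off changes in the edge set. Thresholds and data are assumed for illustration.

```python
import numpy as np

def corr_graph(X, thresh=0.5):
    """Edges of a simple dependence graph: variable pairs whose absolute
    correlation exceeds a threshold (a crude stand-in for a learned
    graphical model structure)."""
    C = np.corrcoef(X.T)
    d = C.shape[0]
    return {(i, j) for i in range(d) for j in range(i + 1, d)
            if abs(C[i, j]) > thresh}

rng = np.random.default_rng(0)
a = rng.standard_normal(300)
X1 = np.column_stack([a, a + 0.1 * rng.standard_normal(300),
                      rng.standard_normal(300)])          # x0-x1 coupled
b = rng.standard_normal(300)
X2 = np.column_stack([rng.standard_normal(300), b,
                      b + 0.1 * rng.standard_normal(300)])  # x1-x2 coupled

g1, g2 = corr_graph(X1), corr_graph(X2)
print("drift detected:", g1 != g2, g1, g2)  # edge change signals hidden-context change
```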
arXiv Detail & Related papers (2021-02-02T12:24:34Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
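The linear-systems core is classical dynamic mode decomposition: fit the best linear operator A with X_{k+1} = A X_k in the least-squares sense and roll it forward. The sketch below omits the paper's stochastic forcing and ensembling.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 20, 400)
# Near-periodic "load" signal observed through 3 delay coordinates.
x = np.sin(t) + 0.3 * np.sin(3.1 * t) + 0.02 * rng.standard_normal(t.size)
X = np.vstack([x[:-3], x[1:-2], x[2:-1]])          # delay-embedded snapshots

# DMD: least-squares fit of the linear operator A with X_{k+1} ~ A X_k.
X0, X1 = X[:, :-1], X[:, 1:]
A = X1 @ np.linalg.pinv(X0)

# Forecast by rolling the intrinsic linear dynamics forward.
state = X[:, -1]
forecast = []
for _ in range(50):
    state = A @ state
    forecast.append(state[-1])
print(np.round(forecast[:5], 3))
```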
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
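A single liquid time-constant unit can be sketched as one Euler-integrated first-order ODE, dx/dt = -(1/tau + f(x, I)) x + f(x, I) A, where the gate f makes the effective time constant input-dependent. The constants below are assumed for illustration, not the paper's trained parameters.

```python
import numpy as np

def ltc_step(x, I, dt=0.05, tau=1.0, A=1.0, w=2.0, b=0.0):
    """One Euler step of a liquid time-constant unit:
        dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A
    The sigmoid gate f makes the effective time constant input-dependent,
    and the state stays bounded between 0 and A."""
    f = 1.0 / (1.0 + np.exp(-(w * I + b)))         # input-dependent gate
    return x + dt * (-(1.0 / tau + f) * x + f * A)

x, trace = 0.0, []
for t in range(200):
    I = 1.0 if 50 <= t < 120 else 0.0              # step input
    x = ltc_step(x, I)
    trace.append(x)
print(round(max(trace), 3), round(trace[-1], 3))   # rises under input, decays after
```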
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.