DAO-GP: Drift Aware Online Non-Linear Regression Gaussian-Process
- URL: http://arxiv.org/abs/2512.08879v1
- Date: Tue, 09 Dec 2025 18:12:38 GMT
- Title: DAO-GP: Drift Aware Online Non-Linear Regression Gaussian-Process
- Authors: Mohammad Abu-Shaira, Ajita Rattani, Weishi Shi
- Abstract summary: Real-world datasets often exhibit temporal dynamics characterized by evolving data distributions. Disregarding this phenomenon, commonly referred to as concept drift, can significantly diminish a model's predictive accuracy. DAO-GP (Drift-Aware Online Gaussian Process) is a novel, fully adaptive, hyperparameter-free, and sparse non-linear regression model.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world datasets often exhibit temporal dynamics characterized by evolving data distributions. Disregarding this phenomenon, commonly referred to as concept drift, can significantly diminish a model's predictive accuracy. Furthermore, the presence of hyperparameters in online models exacerbates this issue. These parameters are typically fixed and cannot be dynamically adjusted by the user in response to the evolving data distribution. Gaussian Process (GP) models offer powerful non-parametric regression capabilities with uncertainty quantification, making them ideal for modeling complex data relationships in an online setting. However, conventional online GP methods face several critical limitations, including a lack of drift-awareness, reliance on fixed hyperparameters, vulnerability to data snooping, absence of a principled decay mechanism, and memory inefficiencies. In response, we propose DAO-GP (Drift-Aware Online Gaussian Process), a novel, fully adaptive, hyperparameter-free, decayed, and sparse non-linear regression model. DAO-GP features a built-in drift detection and adaptation mechanism that dynamically adjusts model behavior based on the severity of drift. Extensive empirical evaluations confirm DAO-GP's robustness across stationary conditions, diverse drift types (abrupt, incremental, gradual), and varied data characteristics. Analyses demonstrate its dynamic adaptation, efficient in-memory and decay-based management, and evolving inducing points. Compared with state-of-the-art parametric and non-parametric models, DAO-GP consistently achieves superior or competitive performance, establishing it as a drift-resilient solution for online non-linear regression.
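The drift-detection-and-decay loop described in the abstract can be illustrated with a minimal sketch. This is an illustrative toy, not DAO-GP itself: an RBF kernel, a fixed sliding-memory budget, and a z-score "surprise" test stand in for the paper's hyperparameter-free drift detection, decay, and inducing-point mechanisms, whose details are not given here.

```python
import numpy as np

def rbf(X1, X2, ls=1.0, var=1.0):
    """RBF kernel between two sets of points."""
    d = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d / ls**2)

class OnlineGP:
    """Toy drift-aware online GP: sliding memory plus a z-score drift test."""

    def __init__(self, max_mem=50, noise=0.1, drift_z=3.0):
        self.X, self.y = [], []
        self.max_mem, self.noise, self.drift_z = max_mem, noise, drift_z

    def predict(self, x):
        if not self.X:
            return 0.0, 1.0 + self.noise**2      # prior before any data
        X, y = np.array(self.X), np.array(self.y)
        K = rbf(X, X) + self.noise**2 * np.eye(len(X))
        k = rbf(X, x[None, :]).ravel()
        mu = k @ np.linalg.solve(K, y)
        var = 1.0 - k @ np.linalg.solve(K, k) + self.noise**2
        return float(mu), float(max(var, 1e-9))

    def update(self, x, y):
        mu, var = self.predict(x)
        z = abs(y - mu) / np.sqrt(var)           # standardized surprise
        if z > self.drift_z:                     # likely drift: decay memory
            keep = len(self.X) // 2
            self.X, self.y = self.X[keep:], self.y[keep:]
        self.X.append(x); self.y.append(y)
        if len(self.X) > self.max_mem:           # sparse memory budget
            self.X.pop(0); self.y.pop(0)
        return z
```

On a stationary stream the surprise statistic stays small and the model simply maintains its memory window; a large prediction error triggers the decay step, down-weighting stale points so the posterior can re-fit the new regime.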
Related papers
- Is Flow Matching Just Trajectory Replay for Sequential Data?
Flow matching (FM) is increasingly used for time-series generation. It is not well understood whether it learns a general dynamical structure or simply performs an effective "trajectory replay". We show that the implied sampler is an ODE whose dynamics constitutes a nonparametric, memory-augmented continuous-time dynamical system.
arXiv Detail & Related papers (2026-02-09T06:48:45Z)
- OLC-WA: Drift Aware Tuning-Free Online Classification with Weighted Average
This paper introduces Online Classification with Weighted Average (OLC-WA). OLC-WA operates by blending incoming data streams with an existing base model. An integrated optimization mechanism dynamically detects concept drift, quantifies its magnitude, and adjusts the model.
arXiv Detail & Related papers (2025-12-14T17:52:39Z)
- OLR-WAA: Adaptive and Drift-Resilient Online Regression with Dynamic Weighted Averaging
This paper introduces "OLR-WAA: An Adaptive and Drift-Resilient Online Regression with Dynamic Weighted Average."
arXiv Detail & Related papers (2025-12-14T17:39:51Z)
- Mind the Jumps: A Scalable Robust Local Gaussian Process for Multidimensional Response Surfaces with Discontinuities
Robust Local Gaussian Process (RLGP) is a framework that integrates adaptive nearest-neighbor selection with a sparsity-driven robustification mechanism. It consistently delivers high predictive accuracy and maintains competitive computational efficiency. These results establish RLGP as an effective and practical solution for modeling nonstationary and discontinuous response surfaces.
arXiv Detail & Related papers (2025-12-14T06:52:17Z)
- Data-Driven Modeling and Correction of Vehicle Dynamics
We develop a data-driven framework for learning and correcting non-autonomous vehicle dynamics. For more strongly nonlinear systems, we employ Flow Map Learning, a deep neural network approach.
arXiv Detail & Related papers (2025-11-29T03:04:28Z)
- Efficient Transformed Gaussian Process State-Space Models for Non-Stationary High-Dimensional Dynamical Systems
We propose an efficient transformed Gaussian process state-space model (ETGPSSM) for scalable and flexible modeling of high-dimensional, non-stationary dynamical systems. Specifically, our ETGPSSM integrates a single shared GP with input-dependent normalizing flows, yielding an expressive implicit process prior that captures complex, non-stationary transition dynamics. Our ETGPSSM outperforms existing GPSSMs and neural network-based SSMs in terms of computational efficiency and accuracy.
arXiv Detail & Related papers (2025-03-24T03:19:45Z)
- datadriftR: An R Package for Concept Drift Detection in Predictive Models
This paper introduces datadriftR, an R package designed to detect concept drift. It proposes a novel method called Profile Drift Detection (PDD) that enables both drift detection and an enhanced understanding of the cause behind the drift.
arXiv Detail & Related papers (2024-12-15T20:59:49Z)
- Ensemble Kalman Filtering Meets Gaussian Process SSM for Non-Mean-Field and Online Inference
We introduce an ensemble Kalman filter (EnKF) into the non-mean-field (NMF) variational inference framework to approximate the posterior distribution of the latent states.
This novel marriage between EnKF and GPSSM not only eliminates the need for extensive parameterization in learning variational distributions, but also enables an interpretable, closed-form approximation of the evidence lower bound (ELBO).
We demonstrate that the resulting EnKF-aided online algorithm embodies a principled objective function by ensuring data-fitting accuracy while incorporating model regularizations to mitigate overfitting.
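The EnKF analysis step at the core of this approach can be sketched in a few lines. This is a generic stochastic (perturbed-observation) EnKF update for a single scalar observation, not the paper's variational GPSSM machinery; the function name and argument layout are illustrative.

```python
import numpy as np

def enkf_update(ens, y_obs, H, obs_var, rng):
    """One stochastic EnKF analysis step for a scalar observation.

    ens: (N, d) ensemble of state samples; H: (d,) linear observation map.
    """
    N = ens.shape[0]
    y_pred = ens @ H                               # predicted observations
    X_anom = ens - ens.mean(axis=0)                # state anomalies
    y_anom = y_pred - y_pred.mean()
    Pxy = X_anom.T @ y_anom / (N - 1)              # state-obs cross-covariance
    Pyy = y_anom @ y_anom / (N - 1) + obs_var      # innovation variance
    K = Pxy / Pyy                                  # Kalman gain, shape (d,)
    perturbed = y_obs + rng.normal(0.0, np.sqrt(obs_var), N)
    return ens + np.outer(perturbed - y_pred, K)   # analysis ensemble
```

Because the gain is built from ensemble sample covariances, the update needs no explicit parameterization of the posterior, which is the property the EnKF-aided variational scheme exploits.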
arXiv Detail & Related papers (2023-12-10T15:22:30Z)
- Kalman Filter for Online Classification of Non-Stationary Data
In Online Continual Learning (OCL) a learning system receives a stream of data and sequentially performs prediction and training steps.
We introduce a probabilistic Bayesian online learning model by using a neural representation and a state space model over the linear predictor weights.
In experiments in multi-class classification we demonstrate the predictive ability of the model and its flexibility to capture non-stationarity.
arXiv Detail & Related papers (2023-06-14T11:41:42Z)
- Incremental Ensemble Gaussian Processes
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging the random feature-based approximation to perform online prediction and model update with scalability, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
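The random feature-based approximation that each GP expert relies on can be sketched with a generic random Fourier features (RFF) construction; the feature count D and lengthscale below are illustrative choices, not values from the paper.

```python
import numpy as np

# phi(x) = sqrt(2/D) * cos(W x + b) approximates an RBF kernel when the
# rows of W are drawn from N(0, 1/ls^2) and b is uniform on [0, 2*pi).
def rff(X, W, b):
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

rng = np.random.default_rng(0)
D, ls = 500, 1.0
W = rng.normal(0.0, 1.0 / ls, size=(D, 1))
b = rng.uniform(0.0, 2.0 * np.pi, D)

X = rng.uniform(-3.0, 3.0, (300, 1))
Phi = rff(X, W, b)
K_exact = np.exp(-0.5 * (X - X.T) ** 2 / ls**2)    # exact RBF Gram matrix
K_rff = Phi @ Phi.T                                 # low-rank approximation
err = np.abs(K_exact - K_rff).max()
```

Working in the D-dimensional feature space lets each expert update a finite weight vector online at cost independent of the number of observations, instead of refactorizing an ever-growing Gram matrix.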
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
- Deep Gaussian Processes for Biogeophysical Parameter Retrieval and Model Inversion
This paper introduces the use of deep Gaussian Processes (DGPs) for bio-geo-physical model inversion.
Unlike shallow GP models, DGPs account for complicated (modular, hierarchical) processes and provide an efficient solution that scales well to big datasets.
arXiv Detail & Related papers (2021-04-16T10:42:01Z)
- A Hypergradient Approach to Robust Regression without Correspondence
We consider a variant of regression problem, where the correspondence between input and output data is not available.
Most existing methods are only applicable when the sample size is small.
We propose a new computational framework -- ROBOT -- for the shuffled regression problem.
arXiv Detail & Related papers (2020-11-30T21:47:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all listed papers) and is not responsible for any consequences arising from its use.