Monotonic Neural Ordinary Differential Equation: Time-series Forecasting
for Cumulative Data
- URL: http://arxiv.org/abs/2309.13452v1
- Date: Sat, 23 Sep 2023 18:40:10 GMT
- Title: Monotonic Neural Ordinary Differential Equation: Time-series Forecasting
for Cumulative Data
- Authors: Zhichao Chen, Leilei Ding, Zhixuan Chu, Yucheng Qi, Jianmin Huang, Hao
Wang
- Abstract summary: We propose a principled approach called Monotonic neural Ordinary Differential Equation (MODE) within the framework of neural ordinary differential equations.
By leveraging MODE, we are able to effectively capture and represent the monotonicity and irregularity in practical cumulative data.
We demonstrate that MODE outperforms state-of-the-art methods, showcasing its ability to handle both monotonicity and irregularity in cumulative data.
- Score: 9.03818193356305
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time-Series Forecasting based on Cumulative Data (TSFCD) is a crucial problem
in decision-making across various industrial scenarios. However, existing
time-series forecasting methods often overlook two important characteristics of
cumulative data, namely monotonicity and irregularity, which limit their
practical applicability. To address this limitation, we propose a principled
approach called Monotonic neural Ordinary Differential Equation (MODE) within
the framework of neural ordinary differential equations. By leveraging MODE, we
are able to effectively capture and represent the monotonicity and irregularity
in practical cumulative data. Through extensive experiments conducted in a
bonus allocation scenario, we demonstrate that MODE outperforms
state-of-the-art methods, showcasing its ability to handle both monotonicity
and irregularity in cumulative data and delivering superior forecasting
performance.
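The paper's exact architecture is not reproduced in this abstract. One common way to obtain the monotonicity it describes is to constrain the ODE vector field to be non-negative, so the integrated state can only accumulate; the sketch below (softplus-constrained toy vector field, explicit Euler over an irregular time grid — both assumptions, not the authors' design) illustrates the idea:

```python
import numpy as np

def softplus(z):
    # Smooth non-negative activation: log(1 + e^z) >= 0 for all z.
    return np.logaddexp(0.0, z)

def monotone_vector_field(x, t, w1, w2):
    # Toy vector field whose output passes through softplus, so dx/dt >= 0
    # everywhere -- the integrated trajectory can never decrease.
    h = np.tanh(w1 * x + t)
    return softplus(w2 * h)

def integrate(x0, ts, w1=0.7, w2=1.3):
    # Explicit Euler over a possibly irregular time grid ts, mirroring the
    # irregular sampling of practical cumulative data.
    xs = [x0]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        xs.append(xs[-1] + (t1 - t0) * monotone_vector_field(xs[-1], t0, w1, w2))
    return np.array(xs)

ts = np.array([0.0, 0.1, 0.35, 0.4, 1.0])  # irregularly spaced timestamps
xs = integrate(1.0, ts)
```

Because every Euler increment is a non-negative rate times a positive step, `xs` is non-decreasing by construction, which is the property cumulative series require.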
Related papers
- Invertible Solution of Neural Differential Equations for Analysis of
Irregularly-Sampled Time Series [4.14360329494344]
We propose an invertible Neural Differential Equation (NDE)-based method to handle the complexities of irregular and incomplete time series data.
Our method combines a variation of Neural Controlled Differential Equations (Neural CDEs) with Neural Flow, which ensures invertibility while maintaining a lower computational burden.
At the core of our approach is an enhanced dual latent states architecture, carefully designed for high precision across various time series tasks.
arXiv Detail & Related papers (2024-01-10T07:51:02Z)
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC performs both parameter estimation and particle proposal adaptation efficiently and entirely on-the-fly.
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
- Data-driven Modeling and Inference for Bayesian Gaussian Process ODEs
via Double Normalizing Flows [28.62579476863723]
We introduce normalizing flows to reparameterize the ODE vector field, resulting in a data-driven prior distribution.
We also apply normalizing flows to the posterior inference of GP ODEs to resolve the issue of strong mean-field assumptions.
We validate the effectiveness of our approach on simulated dynamical systems and real-world human motion data.
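The flows used in the paper are not specified in this summary. As a minimal stand-in, the sketch below shows the basic ingredients any normalizing-flow reparameterization needs — an invertible map with a tractable log-determinant (here a simple element-wise affine layer, an assumption for illustration only):

```python
import numpy as np

class AffineFlow:
    """Element-wise affine flow: y = exp(s) * x + t.

    Invertible by construction, with log |det J| = sum(s) -- the two
    properties a flow needs to reparameterize a distribution over an
    ODE vector field while keeping densities computable.
    """
    def __init__(self, s, t):
        self.s, self.t = np.asarray(s, float), np.asarray(t, float)

    def forward(self, x):
        return np.exp(self.s) * x + self.t

    def inverse(self, y):
        return (y - self.t) * np.exp(-self.s)

    def log_det(self):
        # Jacobian is diag(exp(s)), so log-determinant is just sum(s).
        return float(np.sum(self.s))

flow = AffineFlow(s=[0.5, -0.2], t=[1.0, 0.0])
x = np.array([2.0, 3.0])
y = flow.forward(x)
```

Real flow-based priors stack richer invertible layers, but each must expose exactly this interface: forward, inverse, and log-determinant.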
arXiv Detail & Related papers (2023-09-17T09:28:47Z)
- Predicting Ordinary Differential Equations with Transformers [65.07437364102931]
We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2023-07-24T08:46:12Z)
- Continuous-Time Modeling of Counterfactual Outcomes Using Neural
Controlled Differential Equations [84.42837346400151]
Estimating counterfactual outcomes over time has the potential to unlock personalized healthcare.
Existing causal inference approaches consider regular, discrete-time intervals between observations and treatment decisions.
We propose a controllable simulation environment based on a model of tumor growth for a range of scenarios.
arXiv Detail & Related papers (2022-06-16T17:15:15Z)
- The Variational Method of Moments [65.91730154730905]
The conditional moment problem is a powerful formulation for describing structural causal parameters in terms of observables.
Motivated by a variational minimax reformulation of OWGMM, we define a very general class of estimators for the conditional moment problem.
We provide algorithms for valid statistical inference based on the same kind of variational reformulations.
arXiv Detail & Related papers (2020-12-17T07:21:06Z)
- Accurate Characterization of Non-Uniformly Sampled Time Series using
Stochastic Differential Equations [0.0]
Non-uniform sampling arises when an experimenter does not have full control over the sampling characteristics of the process under investigation.
We introduce new initial estimates for the numerical optimization of the likelihood.
We show the increased accuracy achieved by the new estimator in simulation experiments.
arXiv Detail & Related papers (2020-07-02T13:03:09Z)
- STEER: Simple Temporal Regularization For Neural ODEs [80.80350769936383]
We propose a new regularization technique: randomly sampling the end time of the ODE during training.
The proposed regularization is simple to implement, has negligible overhead and is effective across a wide variety of tasks.
We show through experiments on normalizing flows, time series models and image recognition that the proposed regularization can significantly decrease training time and even improve performance over baseline models.
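The abstract's claim that the regularization is "simple to implement" can be illustrated in a few lines: during training, the ODE end time is drawn uniformly from an interval around the nominal horizon. The interval half-width `b` below is a hypothetical hyperparameter name chosen for this sketch, not notation taken from the paper:

```python
import random

def steer_end_time(t_final, b, rng=random):
    # Sample the ODE integration end time uniformly from
    # (t_final - b, t_final + b). Requires 0 <= b < t_final so the
    # perturbed horizon stays positive.
    return rng.uniform(t_final - b, t_final + b)

# Each training step would integrate the ODE up to a freshly sampled
# end time instead of the fixed t_final = 1.0.
random.seed(0)
samples = [steer_end_time(1.0, 0.5) for _ in range(1000)]
```

The overhead is one random draw per step, consistent with the "negligible overhead" claim.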
arXiv Detail & Related papers (2020-06-18T17:44:50Z) - Neural Controlled Differential Equations for Irregular Time Series [17.338923885534197]
An ordinary differential equation is determined by its initial condition, and there is no mechanism for adjusting the trajectory based on subsequent observations.
Here we demonstrate how this may be resolved through the well-understood mathematics of controlled differential equations.
We show that our model achieves state-of-the-art performance against similar (ODE or RNN based) models in empirical studies on a range of datasets.
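The key contrast the abstract draws — an ODE is fixed by its initial condition, whereas a CDE is driven by the observed path — can be sketched with a toy discretized update dz = f(z) dX, where each new observation increment keeps steering the latent state. The specific `tanh(W z)` field below is an illustrative assumption, not the paper's parameterization:

```python
import numpy as np

def neural_cde_rollout(z0, xs, W):
    # Toy discretized neural CDE: z_{k+1} = z_k + tanh(W z_k) * dX_k.
    # Unlike a plain ODE, the latent trajectory is adjusted by every
    # increment dX of the observed control path, however irregular.
    z = np.asarray(z0, dtype=float)
    path = [z.copy()]
    for dX in np.diff(np.asarray(xs, dtype=float)):
        z = z + np.tanh(W @ z) * dX
        path.append(z.copy())
    return np.stack(path)

W = np.array([[0.2, -0.1],
              [0.05, 0.3]])
xs = [0.0, 0.4, 0.4, 1.0]  # observed control path (irregularly spaced)
path = neural_cde_rollout([1.0, -0.5], xs, W)
```

Note that where the observation does not change (dX = 0), the latent state stays put, while every genuine observation increment adjusts the trajectory — the mechanism a pure initial-value ODE lacks.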
arXiv Detail & Related papers (2020-05-18T17:52:21Z) - A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.