Learning Differential Operators for Interpretable Time Series Modeling
- URL: http://arxiv.org/abs/2209.01491v1
- Date: Sat, 3 Sep 2022 20:14:31 GMT
- Title: Learning Differential Operators for Interpretable Time Series Modeling
- Authors: Yingtao Luo, Chang Xu, Yang Liu, Weiqing Liu, Shun Zheng and Jiang
Bian
- Abstract summary: We propose a learning framework that can automatically obtain interpretable PDE models from sequential data.
Our model can provide valuable interpretability and achieve comparable performance to state-of-the-art models.
- Score: 34.32259687441212
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modeling sequential patterns from data is at the core of various time series
forecasting tasks. Deep learning models have greatly outperformed many
traditional models, but these black-box models generally lack explainability in
prediction and decision making. To reveal the underlying trend with
understandable mathematical expressions, scientists and economists tend to use
partial differential equations (PDEs) to explain the highly nonlinear dynamics
of sequential patterns. However, it usually requires domain expert knowledge
and a series of simplified assumptions, which is not always practical and can
deviate from the ever-changing world. Is it possible to learn the differential
relations from data dynamically to explain the time-evolving dynamics? In this
work, we propose a learning framework that can automatically obtain
interpretable PDE models from sequential data. In particular, this framework is
composed of learnable differential blocks, named $P$-blocks, which are proven
in theory to be able to approximate any time-evolving complex continuous
function. Moreover, to capture shifts in the dynamics, this framework
introduces a meta-learning controller to dynamically optimize the
hyper-parameters of a hybrid PDE model. Extensive experiments on time series
forecasting of
financial, engineering, and health data show that our model can provide
valuable interpretability and achieve comparable performance to
state-of-the-art models. From empirical studies, we find that learning a few
differential operators may capture the major trend of sequential dynamics
without massive computational complexity.
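To make the idea concrete, here is a minimal, hypothetical PyTorch sketch of a learnable differential block in the spirit of the $P$-blocks described above: it estimates spatial derivatives of a 1-D state with central finite differences and learns the coefficients of a simple PDE via an explicit Euler step. The class name `PBlockSketch`, the finite-difference discretization, the toy data, and the training loop are all illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PBlockSketch(nn.Module):
    """Hypothetical sketch of a learnable differential block.

    Models du/dt as a learned linear combination of finite-difference
    spatial derivatives of the state u. The paper's actual P-block may
    differ; this only illustrates learning a PDE's operators from data.
    """

    def __init__(self, dx: float = 1.0):
        super().__init__()
        self.dx = dx
        # Learnable coefficients for each term of a simple hybrid PDE:
        # du/dt = c0*u + c1*u_x + c2*u_xx.
        self.coeffs = nn.Parameter(torch.zeros(3))

    def spatial_derivatives(self, u: torch.Tensor) -> list[torch.Tensor]:
        # Central finite differences on a (batch, length) state,
        # with replicate padding at the boundaries.
        u_pad = F.pad(u.unsqueeze(1), (1, 1), mode="replicate").squeeze(1)
        u_x = (u_pad[:, 2:] - u_pad[:, :-2]) / (2 * self.dx)
        u_xx = (u_pad[:, 2:] - 2 * u + u_pad[:, :-2]) / (self.dx ** 2)
        return [u, u_x, u_xx]

    def forward(self, u: torch.Tensor, dt: float = 0.1) -> torch.Tensor:
        # One explicit Euler step: u_{t+1} = u_t + dt * sum_k c_k * D_k(u).
        terms = self.spatial_derivatives(u)
        du_dt = sum(c * term for c, term in zip(self.coeffs, terms))
        return u + dt * du_dt


if __name__ == "__main__":
    # Fit the coefficients so the learned dynamics match observed transitions.
    torch.manual_seed(0)
    block = PBlockSketch()
    optimizer = torch.optim.Adam(block.parameters(), lr=1e-2)
    u_t = torch.randn(8, 64)                   # toy "observed" states
    u_next = u_t + 0.05 * torch.randn(8, 64)   # toy next-step states
    for _ in range(100):
        optimizer.zero_grad()
        loss = ((block(u_t) - u_next) ** 2).mean()
        loss.backward()
        optimizer.step()
    # The fitted coefficients read off directly as PDE terms, which is
    # the source of interpretability the abstract describes.
    print(block.coeffs.detach())
```

Because each fitted coefficient corresponds to one differential operator, inspecting them yields an explicit PDE; the paper's meta-learning controller would additionally adapt hyper-parameters such as the step size as the dynamics shift, which this sketch omits.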
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Generative learning for nonlinear dynamics [7.6146285961466]
Generative machine learning models create realistic outputs far beyond their training data.
These successes suggest that generative models learn to effectively parametrize and sample arbitrarily complex distributions.
We aim to connect these classical works to emerging themes in large-scale generative statistical learning.
arXiv Detail & Related papers (2023-11-07T16:53:56Z) - Anamnesic Neural Differential Equations with Orthogonal Polynomial
Projections [6.345523830122166]
We propose PolyODE, a formulation that enforces long-range memory and preserves a global representation of the underlying dynamical system.
Our construction is backed by favourable theoretical guarantees and we demonstrate that it outperforms previous works in the reconstruction of past and future data.
arXiv Detail & Related papers (2023-03-03T10:49:09Z) - Learning PDE Solution Operator for Continuous Modeling of Time-Series [1.39661494747879]
This work presents a partial differential equation (PDE) based framework which improves the dynamics modeling capability.
We propose a neural operator that can handle time continuously without requiring iterative operations or specific grids of temporal discretization.
Our framework opens up a new way for a continuous representation of neural networks that can be readily adopted for real-world applications.
arXiv Detail & Related papers (2023-02-02T03:47:52Z) - Dynamic Latent Separation for Deep Learning [67.62190501599176]
A core problem in machine learning is to learn expressive latent variables for model prediction on complex data.
Here, we develop an approach that improves expressiveness, provides partial interpretation, and is not restricted to specific applications.
arXiv Detail & Related papers (2022-10-07T17:56:53Z) - Learning continuous models for continuous physics [94.42705784823997]
We develop a test based on numerical analysis theory to validate machine learning models for science and engineering applications.
Our results illustrate how principled numerical analysis methods can be coupled with existing ML training/testing methodologies to validate models for science and engineering applications.
arXiv Detail & Related papers (2022-02-17T07:56:46Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, which are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
arXiv Detail & Related papers (2020-10-27T17:54:12Z) - Variational Dynamic Mixtures [18.730501689781214]
We develop variational dynamic mixtures (VDM) to infer sequential latent variables.
In an empirical study, we show that VDM outperforms competing approaches on highly multi-modal datasets.
arXiv Detail & Related papers (2020-10-20T16:10:07Z)