Efficient Interpretable Nonlinear Modeling for Multiple Time Series
- URL: http://arxiv.org/abs/2309.17154v1
- Date: Fri, 29 Sep 2023 11:42:59 GMT
- Title: Efficient Interpretable Nonlinear Modeling for Multiple Time Series
- Authors: Kevin Roy, Luis Miguel Lopez-Ramos and Baltasar Beferull-Lozano
- Abstract summary: This paper proposes an efficient nonlinear modeling approach for multiple time series.
It incorporates nonlinear interactions among different time-series variables.
Experimental results show that the proposed algorithm improves the identification of the support of the VAR coefficients in a parsimonious manner.
- Score: 5.448070998907116
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Predictive linear and nonlinear models based on kernel machines or deep
neural networks have been used to discover dependencies among time series. This
paper proposes an efficient nonlinear modeling approach for multiple time
series, with a complexity comparable to linear vector autoregressive (VAR)
models while still incorporating nonlinear interactions among different
time-series variables. The modeling assumption is that the set of time series
is generated in two steps: first, a linear VAR process in a latent space, and
second, a set of invertible and Lipschitz continuous nonlinear mappings that
are applied per sensor, that is, a component-wise mapping from each latent
variable to a variable in the measurement space. The VAR coefficient
identification provides a topology representation of the dependencies among the
aforementioned variables. The proposed approach models each component-wise
nonlinearity using an invertible neural network and imposes sparsity on the VAR
coefficients to reflect the parsimonious dependencies usually found in real
applications. To efficiently solve the formulated optimization problems, a
custom algorithm is devised combining proximal gradient descent, stochastic
primal-dual updates, and projection to enforce the corresponding constraints.
Experimental results on both synthetic and real data sets show that the
proposed algorithm improves the identification of the support of the VAR
coefficients in a parsimonious manner while also improving the time-series
prediction, as compared to the current state-of-the-art methods.
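The two-step generative model and the sparsity-promoting identification described above can be sketched in a few lines. Everything below is an illustrative assumption, not the paper's actual method: the VAR matrix, the hand-picked monotone mapping g (standing in for the learned invertible neural networks), and the plain ISTA solver (standing in for the custom proximal/primal-dual algorithm).

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1 (assumed toy instance): a sparse linear VAR(1) process in a latent space.
N, T = 3, 500
A = np.array([[0.5, 0.0, 0.3],
              [0.0, 0.4, 0.0],
              [0.0, 0.2, 0.5]])      # sparse coefficients = dependency topology
z = np.zeros((T, N))
for t in range(1, T):
    z[t] = A @ z[t - 1] + rng.standard_normal(N)

# Step 2: a component-wise invertible, Lipschitz-continuous mapping per sensor.
# g(x) = x + 0.5*tanh(x) is strictly monotone (g'(x) in [1, 1.5]), hence
# invertible and Lipschitz; the paper learns such maps with invertible
# neural networks instead of fixing them by hand.
y = z + 0.5 * np.tanh(z)             # measurement-space series

# Sparse VAR identification via proximal gradient descent (ISTA) with
# soft-thresholding -- a simplified stand-in for the paper's custom
# primal-dual algorithm, run here on the latent series for illustration.
def soft_threshold(B, tau):
    return np.sign(B) * np.maximum(np.abs(B) - tau, 0.0)

X, Y = z[:-1], z[1:]                 # regressors and one-step-ahead targets
L = np.linalg.norm(X.T @ X, 2) / T   # Lipschitz constant of the gradient
step, lam = 1.0 / L, 0.05            # step size and l1 penalty weight
A_hat = np.zeros((N, N))
for _ in range(500):
    grad = (A_hat @ X.T - Y.T) @ X / T
    A_hat = soft_threshold(A_hat - step * grad, step * lam)

print("recovered support:\n", (np.abs(A_hat) > 1e-3).astype(int))
```

The l1 proximal step (soft-thresholding) is what drives entries of the estimated coefficient matrix exactly to zero, so the surviving support can be read off directly as a dependency topology among the series.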
Related papers
- Solving Inverse Problems with Model Mismatch using Untrained Neural Networks within Model-based Architectures [14.551812310439004]
We introduce an untrained forward model residual block within the model-based architecture to match the data consistency in the measurement domain for each instance.
Our approach offers a unified solution that is less parameter-sensitive, requires no additional data, and enables simultaneous fitting of the forward model and reconstruction in a single pass.
arXiv Detail & Related papers (2024-03-07T19:02:13Z) - Diffeomorphic Transformations for Time Series Analysis: An Efficient Approach to Nonlinear Warping [0.0]
The proliferation and ubiquity of temporal data across many disciplines have sparked interest in similarity, classification, and clustering methods.
Traditional distance measures such as the Euclidean are not well-suited due to the time-dependent nature of the data.
This thesis proposes novel elastic alignment methods that use parametric & diffeomorphic warping transformations.
arXiv Detail & Related papers (2023-09-25T10:51:47Z) - Data-driven Nonlinear Parametric Model Order Reduction Framework using Deep Hierarchical Variational Autoencoder [5.521324490427243]
A data-driven parametric model order reduction (MOR) method using a deep artificial neural network is proposed.
LSH-VAE is capable of performing nonlinear MOR for a parametric nonlinear dynamic system with a significant number of degrees of freedom.
arXiv Detail & Related papers (2023-07-10T02:44:53Z) - An Interpretable and Efficient Infinite-Order Vector Autoregressive Model for High-Dimensional Time Series [1.4939176102916187]
This paper proposes a novel sparse infinite-order VAR model for high-dimensional time series.
The temporal and cross-sectional structures of the VARMA-type dynamics captured by this model can be interpreted separately.
Greater statistical efficiency and interpretability can be achieved with little loss of temporal information.
arXiv Detail & Related papers (2022-09-02T17:14:24Z) - Non-linear manifold ROM with Convolutional Autoencoders and Reduced Over-Collocation method [0.0]
Non-affine parametric dependencies, nonlinearities and advection-dominated regimes of the model of interest can result in a slow Kolmogorov n-width decay.
We implement the non-linear manifold method introduced by Carlberg et al. [37], with hyper-reduction achieved through reduced over-collocation and teacher-student training of a reduced decoder.
We test the methodology on a 2d non-linear conservation law and a 2d shallow water model, and compare the results with those of a purely data-driven method in which the dynamics is evolved in time with a long short-term memory network.
arXiv Detail & Related papers (2022-03-01T11:16:50Z) - A Priori Denoising Strategies for Sparse Identification of Nonlinear Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
arXiv Detail & Related papers (2022-01-29T23:31:25Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - DiffPD: Differentiable Projective Dynamics with Contact [65.88720481593118]
We present DiffPD, an efficient differentiable soft-body simulator with implicit time integration.
We evaluate the performance of DiffPD and observe a speedup of 4-19 times compared to the standard Newton's method in various applications.
arXiv Detail & Related papers (2021-01-15T00:13:33Z) - Estimation of Switched Markov Polynomial NARX models [75.91002178647165]
We identify a class of models for hybrid dynamical systems characterized by nonlinear autoregressive (NARX) components.
The proposed approach is demonstrated on a SMNARX problem composed by three nonlinear sub-models with specific regressors.
arXiv Detail & Related papers (2020-09-29T15:00:47Z) - Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a suitable structure.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior, and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.