Temporal Forward-Backward Consistency, Not Residual Error, Measures the
Prediction Accuracy of Extended Dynamic Mode Decomposition
- URL: http://arxiv.org/abs/2207.07719v1
- Date: Fri, 15 Jul 2022 19:22:22 GMT
- Title: Temporal Forward-Backward Consistency, Not Residual Error, Measures the
Prediction Accuracy of Extended Dynamic Mode Decomposition
- Authors: Masih Haseli, Jorge Cortés
- Abstract summary: Extended Dynamic Mode Decomposition (EDMD) is a method to approximate the action of the Koopman operator on a linear function space spanned by a dictionary of functions.
We introduce the novel concept of consistency index.
We show that this measure, based on using EDMD forward and backward in time, enjoys a number of desirable qualities.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Extended Dynamic Mode Decomposition (EDMD) is a popular data-driven method to
approximate the action of the Koopman operator on a linear function space
spanned by a dictionary of functions. The accuracy of the EDMD model critically
depends on the quality of the particular dictionary's span, specifically on how
close it is to being invariant under the Koopman operator. Motivated by the
observation that the residual error of EDMD, typically used for dictionary
learning, does not encode the quality of the function space and is sensitive to
the choice of basis, we introduce the novel concept of consistency index. We
show that this measure, based on using EDMD forward and backward in time,
enjoys a number of desirable qualities that make it suitable for data-driven
modeling of dynamical systems: it measures the quality of the function space,
it is invariant under the choice of basis, can be computed in closed form from
the data, and provides a tight upper-bound for the relative root mean square
error of all function predictions on the entire span of the dictionary.
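The forward-backward idea behind the consistency index can be sketched in a few lines of NumPy. The formulation below (least-squares EDMD matrices, with the largest eigenvalue of I − K_f K_b taken as the index) is an illustrative reading of the abstract, not the paper's exact definition:

```python
import numpy as np

def edmd(Phi_X, Phi_Y):
    # Least-squares Koopman matrix K with Phi_Y ≈ Phi_X @ K,
    # where rows of Phi_X / Phi_Y are dictionary evaluations at
    # consecutive snapshots.
    return np.linalg.lstsq(Phi_X, Phi_Y, rcond=None)[0]

def consistency_index(Phi_X, Phi_Y):
    # Forward EDMD: predicts dictionary values one step ahead.
    K_f = edmd(Phi_X, Phi_Y)
    # Backward EDMD: swaps the roles of the snapshot matrices.
    K_b = edmd(Phi_Y, Phi_X)
    # If the dictionary's span were exactly Koopman-invariant,
    # K_f @ K_b would be the identity; the largest eigenvalue of
    # I - K_f @ K_b quantifies the deviation (illustrative choice).
    n = Phi_X.shape[1]
    return np.max(np.real(np.linalg.eigvals(np.eye(n) - K_f @ K_b)))
```

On data generated by a linear system with the identity dictionary (an exactly invariant subspace), the index comes out numerically zero; richer dictionaries applied to nonlinear systems generally yield a positive value.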
Related papers
- Learning dynamical systems from data: Gradient-based dictionary optimization [0.8643517734716606]
We present a novel gradient descent-based optimization framework for learning suitable basis functions from data.
We show how it can be used in combination with EDMD, SINDy, and PDE-FIND.
arXiv Detail & Related papers (2024-11-07T15:15:27Z)
- Breaking Determinism: Fuzzy Modeling of Sequential Recommendation Using Discrete State Space Diffusion Model [66.91323540178739]
Sequential recommendation (SR) aims to predict items that users may be interested in based on their historical behavior.
We revisit SR from a novel information-theoretic perspective and find that sequential modeling methods fail to adequately capture randomness and unpredictability of user behavior.
Inspired by fuzzy information processing theory, this paper introduces the fuzzy sets of interaction sequences to overcome the limitations and better capture the evolution of users' real interests.
arXiv Detail & Related papers (2024-10-31T14:52:01Z)
- Learning Invariant Subspaces of Koopman Operators--Part 1: A Methodology for Demonstrating a Dictionary's Approximate Subspace Invariance [0.0]
In a widely used algorithm, Extended Dynamic Mode Decomposition, the dictionary functions are drawn from a fixed class of functions.
Deep learning combined with EDMD has been used to learn novel dictionary functions in an algorithm called deep dynamic mode decomposition (deepDMD).
In this paper we analyze the learned dictionaries from deepDMD and explore the theoretical basis for their strong performance.
arXiv Detail & Related papers (2022-12-14T17:33:52Z)
- Adaptive LASSO estimation for functional hidden dynamic geostatistical model [69.10717733870575]
We propose a novel model selection algorithm based on a penalized maximum likelihood estimator (PMLE) for functional hidden dynamic geostatistical models (f-HD).
The algorithm is based on iterative optimisation and uses an adaptive least absolute shrinkage and selection operator (GMSOLAS) penalty function, wherein the weights are obtained from the unpenalised f-HD maximum-likelihood estimators.
arXiv Detail & Related papers (2022-08-10T19:17:45Z)
- Heterogeneous mixtures of dictionary functions to approximate subspace invariance in Koopman operators [0.0]
Deep learning combined with EDMD has been used to learn novel dictionary functions in an algorithm called deep dynamic mode decomposition (deepDMD).
We discover a novel class of dictionary functions to approximate Koopman observables.
Our results provide a hypothesis to explain the success of deep neural networks in learning numerical approximations to Koopman operators.
arXiv Detail & Related papers (2022-06-27T19:04:03Z)
- ER: Equivariance Regularizer for Knowledge Graph Completion [107.51609402963072]
We propose a new regularizer, namely, the Equivariance Regularizer (ER).
ER can enhance the generalization ability of the model by employing the semantic equivariance between the head and tail entities.
The experimental results indicate a clear and substantial improvement over the state-of-the-art relation prediction methods.
arXiv Detail & Related papers (2022-06-24T08:18:05Z)
- Extension of Dynamic Mode Decomposition for dynamic systems with incomplete information based on t-model of optimal prediction [69.81996031777717]
The Dynamic Mode Decomposition has proved to be a very efficient technique to study dynamic data.
The application of this approach becomes problematic if the available data is incomplete because some smaller-scale dimensions are either missing or unmeasured.
We consider a first-order approximation of the Mori-Zwanzig decomposition, state the corresponding optimization problem and solve it with the gradient-based optimization method.
arXiv Detail & Related papers (2022-02-23T11:23:59Z)
- Meta Learning Low Rank Covariance Factors for Energy-Based Deterministic Uncertainty [58.144520501201995]
Bi-Lipschitz regularization of neural network layers preserves relative distances between data instances in the feature space of each layer.
With the use of an attentive set encoder, we propose to meta learn either diagonal or diagonal plus low-rank factors to efficiently construct task specific covariance matrices.
We also propose an inference procedure which utilizes scaled energy to achieve a final predictive distribution.
arXiv Detail & Related papers (2021-10-12T22:04:19Z)
- Deep Identification of Nonlinear Systems in Koopman Form [0.0]
The present paper treats the identification of nonlinear dynamical systems using Koopman-based deep state-space encoders.
An input-affine formulation is considered for the lifted model structure and we address both full and partial state availability.
arXiv Detail & Related papers (2021-10-06T08:50:56Z)
- Generalizing Dynamic Mode Decomposition: Balancing Accuracy and Expressiveness in Koopman Approximations [0.0]
This paper tackles the data-driven approximation of unknown dynamical systems using Koopman-operator methods.
We propose the Tunable Symmetric Subspace Decomposition algorithm to refine the dictionary.
We provide a full characterization of the algorithm properties and show that it generalizes both Extended Dynamic Mode Decomposition and Symmetric Subspace Decomposition.
arXiv Detail & Related papers (2021-08-08T19:11:41Z)
- Supervised Learning for Non-Sequential Data: A Canonical Polyadic Decomposition Approach [85.12934750565971]
Efficient modelling of feature interactions underpins supervised learning for non-sequential tasks, but explicitly parameterizing all interactions quickly becomes intractable.
To alleviate this issue, it has been proposed to implicitly represent the model parameters as a tensor.
For enhanced expressiveness, we generalize the framework to allow feature mapping to arbitrarily high-dimensional feature vectors.
arXiv Detail & Related papers (2020-01-27T22:38:40Z)
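The implicit tensor parameterization in the last entry can be illustrated with a rank-R canonical polyadic (CP) sketch: rather than storing the full weight tensor over all feature interactions, one keeps a small factor matrix per input coordinate. The per-coordinate feature map [1, x_j] and all names below are illustrative assumptions, not the paper's construction:

```python
import numpy as np

def cp_predict(x, factors):
    """Score of input x under a CP-factorized weight tensor.

    factors: one (rank, 2) matrix per coordinate of x.  The implicit
    weight tensor is the sum over ranks of outer products of the
    factor rows, so the model covers every multiplicative feature
    interaction without materializing the exponentially large tensor.
    """
    rank = factors[0].shape[0]
    scores = np.ones(rank)
    for j, F in enumerate(factors):
        phi = np.array([1.0, x[j]])  # illustrative feature map per coordinate
        scores *= F @ phi            # rank-wise inner products
    return float(scores.sum())       # contract over the CP rank
```

With rank 1 and all-ones factors, the score reduces to the product of (1 + x_j) over coordinates, making the interaction structure easy to check by hand.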
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.