Improving Estimation of the Koopman Operator with Kolmogorov-Smirnov
Indicator Functions
- URL: http://arxiv.org/abs/2306.05945v1
- Date: Fri, 9 Jun 2023 15:01:43 GMT
- Title: Improving Estimation of the Koopman Operator with Kolmogorov-Smirnov
Indicator Functions
- Authors: Van A. Ngo, Yen Ting Lin, Danny Perez
- Abstract summary: Key to the practical success of the approach is the identification of a set of observables that form a good basis in which to expand the slow relaxation modes.
We propose a simple and computationally efficient clustering procedure to infer surrogate observables that form a good basis for slow modes.
We consistently demonstrate that the inferred indicator functions can significantly improve the estimation of the leading eigenvalues of the Koopman operators.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: It has become common to perform kinetic analysis using approximate Koopman
operators that transform high-dimensional time series of observables into
ranked dynamical modes. Key to the practical success of the approach is the
identification of a set of observables that form a good basis in which to
expand the slow relaxation modes. Good observables are, however, difficult to
identify a priori, and sub-optimal choices can lead to significant
underestimation of characteristic timescales. Leveraging the representation of
slow dynamics in terms of a Hidden Markov Model (HMM), we propose a simple and
computationally efficient clustering procedure to infer surrogate observables
that form a good basis for slow modes. We apply the approach to an analytically
solvable model system, as well as to three protein systems of different
complexities. We consistently demonstrate that the inferred indicator functions
can significantly improve the estimation of the leading eigenvalues of the
Koopman operators and correctly identify key states and transition timescales
of stochastic systems, even when good observables are not known a priori.
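As a rough illustration of the pipeline the abstract describes (cluster the trajectory, turn cluster labels into indicator observables, and estimate Koopman eigenvalues and implied timescales from them), here is a minimal sketch. It uses k-means as a stand-in for the paper's Kolmogorov-Smirnov-based clustering procedure; the function name, toy data, and parameter choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans


def indicator_koopman_timescales(traj, n_states=3, lag=10, dt=1.0, seed=0):
    """Sketch: cluster a trajectory, build indicator observables from the
    cluster labels, estimate a Koopman matrix by EDMD at a given lag, and
    return eigenvalues plus implied relaxation timescales.

    `traj` is an (n_frames, n_features) array of observables; k-means is a
    stand-in for the paper's KS-based clustering procedure (assumption).
    """
    labels = KMeans(n_clusters=n_states, n_init=10,
                    random_state=seed).fit_predict(traj)

    # Indicator (one-hot) functions chi_i(x_t): 1 if frame t is in cluster i.
    chi = np.eye(n_states)[labels]                # (n_frames, n_states)

    # EDMD estimate at lag tau: K ~ C00^{-1} C0t, with
    # C00 = <chi_t chi_t^T> and C0t = <chi_t chi_{t+tau}^T>.
    X0, Xt = chi[:-lag], chi[lag:]
    C00 = X0.T @ X0 / len(X0)
    C0t = X0.T @ Xt / len(X0)
    K = np.linalg.solve(C00, C0t)

    # Eigenvalues of K give implied timescales t_i = -tau*dt / ln|lambda_i|;
    # the leading eigenvalue (~1) corresponds to the stationary process.
    eigvals = np.sort(np.abs(np.linalg.eigvals(K)))[::-1]
    timescales = -lag * dt / np.log(eigvals[1:])
    return eigvals, timescales


# Toy example: a two-state jump process observed through noisy 2-D features.
rng = np.random.default_rng(0)
state = np.zeros(20000, dtype=int)
for t in range(1, len(state)):
    flip = rng.random() < (0.01 if state[t - 1] == 0 else 0.02)
    state[t] = 1 - state[t - 1] if flip else state[t - 1]
traj = state[:, None] + 0.3 * rng.standard_normal((len(state), 2))
print(indicator_koopman_timescales(traj, n_states=2, lag=5))
```

The slowest non-unit eigenvalue of the estimated matrix sets the leading relaxation timescale; this is the quantity that a better-chosen indicator basis is meant to estimate more accurately.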
Related papers
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- Enhancing Predictive Capabilities in Data-Driven Dynamical Modeling with Automatic Differentiation: Koopman and Neural ODE Approaches [0.0]
Data-driven approximations of the Koopman operator are promising for predicting the time evolution of systems characterized by complex dynamics.
Here we present a modification of EDMD-DL that concurrently determines both the dictionary of observables and the corresponding approximation of the Koopman operator.
arXiv Detail & Related papers (2023-10-10T17:04:21Z)
- Learning invariant representations of time-homogeneous stochastic dynamical systems [27.127773672738535]
We study the problem of learning a representation of the state that faithfully captures its dynamics.
This is instrumental to learning the transfer operator or the generator of the system.
We show that the search for a good representation can be cast as an optimization problem over neural networks.
arXiv Detail & Related papers (2023-07-19T11:32:24Z)
- Retrieval of Boost Invariant Symbolic Observables via Feature Importance [0.0]
Deep learning approaches for jet tagging in high-energy physics are characterized as black boxes that process a large amount of information from which it is difficult to extract key distinctive observables.
We present an alternative to deep learning approaches, Boost Invariant Polynomials, which enables direct analysis of simple analytic expressions representing the most important features in a given task.
arXiv Detail & Related papers (2023-06-23T13:41:06Z)
- ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model class, namely "Denoising Diffusion Probabilistic Models" (DDPMs), for chirographic data.
Our model, named "ChiroDiff", is non-autoregressive; it learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z)
- Neural Superstatistics for Bayesian Estimation of Dynamic Cognitive Models [2.7391842773173334]
We develop a simulation-based deep learning method for Bayesian inference, which can recover both time-varying and time-invariant parameters.
Our results show that the deep learning approach is very efficient in capturing the temporal dynamics of the model.
arXiv Detail & Related papers (2022-11-23T17:42:53Z)
- HyperImpute: Generalized Iterative Imputation with Automatic Model Selection [77.86861638371926]
We propose a generalized iterative imputation framework for adaptively and automatically configuring column-wise models.
We provide a concrete implementation with out-of-the-box learners, simulators, and interfaces.
arXiv Detail & Related papers (2022-06-15T19:10:35Z)
- Identification and Adaptation with Binary-Valued Observations under Non-Persistent Excitation Condition [1.6897716547971817]
We propose an online projected Quasi-Newton type algorithm for parameter estimation in regression models with binary-valued observations.
We establish the strong consistency of the estimation algorithm and provide the convergence rate.
Convergence of adaptive predictors and their applications in adaptive control are also discussed.
arXiv Detail & Related papers (2021-07-08T03:57:50Z)
- On Contrastive Representations of Stochastic Processes [53.21653429290478]
Learning representations of processes is an emerging problem in machine learning.
We show that our methods are effective for learning representations of periodic functions, 3D objects and dynamical processes.
arXiv Detail & Related papers (2021-06-18T11:00:24Z)
- Bayesian Attention Modules [65.52970388117923]
We propose a scalable version of attention that is easy to implement and optimize.
Our experiments show the proposed method brings consistent improvements over the corresponding baselines.
arXiv Detail & Related papers (2020-10-20T20:30:55Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid; a generic forced-linear-model sketch illustrating this idea appears after this list.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
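The "Stochastically forced ensemble dynamic mode decomposition" entry above models observed dynamics as a forced linear system. As a generic illustration of that building block (not the cited paper's ensemble method), the sketch below fits x_{t+1} ≈ A x_t + B u_t by least squares on snapshot pairs and rolls it forward; all names, the toy data, and the parameter choices are assumptions for demonstration only.

```python
import numpy as np


def fit_forced_linear_model(X, U):
    """Sketch: fit x_{t+1} ~ A x_t + B u_t by least squares, the basic
    building block behind DMD-with-control-style forecasters.

    X: (n_steps, n_states) state snapshots; U: (n_steps, n_inputs) forcing.
    """
    X0, X1, U0 = X[:-1], X[1:], U[:-1]
    Z = np.hstack([X0, U0])                        # regressors [x_t, u_t]
    G, *_ = np.linalg.lstsq(Z, X1, rcond=None)     # X1 ~ Z @ G
    A = G[: X.shape[1]].T                          # (n_states, n_states)
    B = G[X.shape[1]:].T                           # (n_states, n_inputs)
    return A, B


def forecast(A, B, x0, U_future):
    """Roll the fitted linear model forward under a given forcing sequence."""
    xs = [np.asarray(x0, dtype=float)]
    for u in U_future:
        xs.append(A @ xs[-1] + B @ u)
    return np.array(xs)


# Toy near-periodic "load": a damped rotation driven by a daily sinusoid.
rng = np.random.default_rng(1)
theta = 2 * np.pi / 24
A_true = 0.98 * np.array([[np.cos(theta), -np.sin(theta)],
                          [np.sin(theta),  np.cos(theta)]])
B_true = np.array([[0.5], [0.0]])
U = np.sin(2 * np.pi * np.arange(500) / 24)[:, None]
X = np.zeros((500, 2))
for t in range(499):
    X[t + 1] = A_true @ X[t] + B_true @ U[t] + 0.01 * rng.standard_normal(2)

A_hat, B_hat = fit_forced_linear_model(X, U)
X_pred = forecast(A_hat, B_hat, X[0], U[:48])
print(np.round(A_hat, 3))   # recovered dynamics matrix, close to A_true
print(X_pred[:5])           # rolled-out forecast under the known forcing
```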
This list is automatically generated from the titles and abstracts of the papers on this site.