Learning Invariant Subspaces of Koopman Operators--Part 2: Heterogeneous
Dictionary Mixing to Approximate Subspace Invariance
- URL: http://arxiv.org/abs/2212.07365v1
- Date: Wed, 14 Dec 2022 17:40:00 GMT
- Title: Learning Invariant Subspaces of Koopman Operators--Part 2: Heterogeneous
Dictionary Mixing to Approximate Subspace Invariance
- Authors: Charles A. Johnson, Shara Balakrishnan and Enoch Yeung
- Abstract summary: This work builds on the models and concepts presented in part 1 to learn approximate dictionary representations of Koopman operators from data.
We show that structured mixing of heterogeneous dictionary functions achieves the same accuracy and dimensional scaling as the deep-learning-based deepDMD algorithm.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work builds on the models and concepts presented in Part 1
to learn approximate dictionary representations of Koopman operators from
data. Part 1 presented a methodology for demonstrating the approximate
subspace invariance of a Koopman dictionary and applied it to the
state-inclusive logistic lifting (SILL) basis, an affine basis augmented
with conjunctive logistic functions. The SILL dictionary's nonlinear
functions are homogeneous, i.e., drawn from a single function class, which
is the norm in data-driven dictionary learning of Koopman operators. In
this paper, we show that structured mixing of heterogeneous dictionary
functions drawn from different classes of nonlinear functions achieves the
same accuracy and dimensional scaling as the deep-learning-based deepDMD
algorithm. We demonstrate this by building a heterogeneous dictionary
composed of SILL functions and conjunctive radial basis functions (RBFs).
This mixed dictionary matches the accuracy and dimensional scaling of
deepDMD with an order-of-magnitude reduction in parameters, while
maintaining geometric interpretability. These results strengthen the
viability of dictionary-based Koopman models for solving high-dimensional
nonlinear learning problems.
Related papers
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks which approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z) - A Lightweight Randomized Nonlinear Dictionary Learning Method using Random Vector Functional Link [0.6138671548064356]
This paper presents an SVD-free, lightweight approach to learning a nonlinear dictionary based on a randomized feed-forward network called a Random Vector Functional Link (RVFL).
The proposed RVFL-based nonlinear Dictionary Learning (RVFLDL) learns a dictionary as a sparse-to-dense feature map from nonlinear sparse coefficients to the dense input features.
Empirical results on image classification and reconstruction applications show that RVFLDL is scalable and outperforms other nonlinear dictionary learning methods.
arXiv Detail & Related papers (2024-02-06T09:24:53Z) - Dictionary Learning under Symmetries via Group Representations [1.304892050913381]
We study the problem of learning a dictionary that is invariant under a pre-specified group of transformations.
We apply our paradigm to investigate the dictionary learning problem for the groups SO(2) and SO(3).
arXiv Detail & Related papers (2023-05-31T04:54:06Z) - Learning Invariant Subspaces of Koopman Operators--Part 1: A Methodology
for Demonstrating a Dictionary's Approximate Subspace Invariance [0.0]
In the widely used Extended Dynamic Mode Decomposition (EDMD) algorithm, the dictionary functions are drawn from a fixed class of functions (a minimal sketch of the EDMD setup is given after this list).
Deep learning combined with EDMD has been used to learn novel dictionary functions in an algorithm called deep dynamic mode decomposition (deepDMD).
In this paper we analyze the learned dictionaries from deepDMD and explore the theoretical basis for their strong performance.
arXiv Detail & Related papers (2022-12-14T17:33:52Z) - Hierarchical Phrase-based Sequence-to-Sequence Learning [94.10257313923478]
We describe a neural transducer that maintains the flexibility of standard sequence-to-sequence (seq2seq) models while incorporating hierarchical phrases as a source of inductive bias during training and as explicit constraints during inference.
Our approach trains two models: a discriminative parser based on a bracketing grammar whose derivation tree hierarchically aligns source and target phrases, and a neural seq2seq model that learns to translate the aligned phrases one-by-one.
arXiv Detail & Related papers (2022-11-15T05:22:40Z) - Equivariance with Learned Canonicalization Functions [77.32483958400282]
We show that learning a small neural network to perform canonicalization is better than using predefined heuristics.
Our experiments show that learning the canonicalization function is competitive with existing techniques for learning equivariant functions across many tasks.
arXiv Detail & Related papers (2022-11-11T21:58:15Z) - Heterogeneous mixtures of dictionary functions to approximate subspace
invariance in Koopman operators [0.0]
Deep learning combined with EDMD has been used to learn novel dictionary functions in an algorithm called deep dynamic mode decomposition (deepDMD).
We discover a novel class of dictionary functions to approximate Koopman observables.
Our results provide a hypothesis to explain the success of deep neural networks in learning numerical approximations to Koopman operators.
arXiv Detail & Related papers (2022-06-27T19:04:03Z) - Generalizing Dynamic Mode Decomposition: Balancing Accuracy and
Expressiveness in Koopman Approximations [0.0]
This paper tackles the data-driven approximation of unknown dynamical systems using Koopman-operator methods.
We propose the Tunable Symmetric Subspace Decomposition algorithm to refine the dictionary.
We provide a full characterization of the algorithm properties and show that it generalizes both Extended Dynamic Mode Decomposition and Symmetric Subspace Decomposition.
arXiv Detail & Related papers (2021-08-08T19:11:41Z) - Estimating Koopman operators for nonlinear dynamical systems: a
nonparametric approach [77.77696851397539]
The Koopman operator is a mathematical tool that allows for a linear description of non-linear systems.
In this paper we capture their core essence as a dual version of the same framework, incorporating them into the kernel framework.
We establish a strong link between kernel methods and Koopman operators, leading to the estimation of the latter through kernel functions.
arXiv Detail & Related papers (2021-03-25T11:08:26Z) - Model Fusion with Kullback--Leibler Divergence [58.20269014662046]
We propose a method to fuse posterior distributions learned from heterogeneous datasets.
Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors.
arXiv Detail & Related papers (2020-07-13T03:27:45Z) - Anchor & Transform: Learning Sparse Embeddings for Large Vocabularies [60.285091454321055]
We design a simple and efficient embedding algorithm that learns a small set of anchor embeddings and a sparse transformation matrix.
On text classification, language modeling, and movie recommendation benchmarks, we show that ANT is particularly suitable for large vocabulary sizes.
arXiv Detail & Related papers (2020-03-18T13:07:51Z)
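Several entries above turn on Extended Dynamic Mode Decomposition (EDMD), referenced in the Part 1 summary. As a rough sketch of the common setup (our own toy code, not any paper's implementation), EDMD lifts snapshot pairs with a fixed dictionary and fits a finite-dimensional Koopman approximation by least squares; the `psi` argument could be the mixed dictionary sketched after the abstract.

```python
import numpy as np

def edmd_fit(X, Y, psi):
    """Given snapshot pairs x_k -> x_{k+1} stacked as rows of X and Y,
    lift both with the dictionary psi and solve
        min_K || Psi(X) K - Psi(Y) ||_F,
    so psi(x_{k+1})^T ~= psi(x_k)^T K on the span of the dictionary."""
    PsiX, PsiY = psi(X), psi(Y)
    K, *_ = np.linalg.lstsq(PsiX, PsiY, rcond=None)
    return K

# Toy usage: a nonlinear map whose dynamics close on {1, x1, x2, x1^2},
# so this four-function dictionary is exactly Koopman-invariant.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 2))
Y = np.column_stack([0.9 * X[:, 0], 0.8 * X[:, 1] + 0.2 * X[:, 0] ** 2])

psi = lambda Z: np.hstack([np.ones((Z.shape[0], 1)), Z, Z[:, :1] ** 2])
K = edmd_fit(X, Y, psi)  # 4x4; recovers the linear lifted dynamics exactly
```

Because this toy dictionary is invariant for the chosen system, the least-squares residual is numerically zero; with a non-invariant dictionary the residual measures the subspace-invariance error that Parts 1 and 2 analyze.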