Learning Dynamical Systems from Data: A Simple Cross-Validation
Perspective, Part V: Sparse Kernel Flows for 132 Chaotic Dynamical Systems
- URL: http://arxiv.org/abs/2301.10321v1
- Date: Tue, 24 Jan 2023 21:47:33 GMT
- Title: Learning Dynamical Systems from Data: A Simple Cross-Validation
Perspective, Part V: Sparse Kernel Flows for 132 Chaotic Dynamical Systems
- Authors: Lu Yang and Xiuwen Sun and Boumediene Hamzi and Houman Owhadi and
Naiming Xie
- Abstract summary: We introduce the method of \emph{Sparse Kernel Flows} in order to learn the "best" kernel by starting from a large dictionary of kernels.
We apply this approach to a library of 132 chaotic systems.
- Score: 5.124035247669094
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Regressing the vector field of a dynamical system from a finite number of
observed states is a natural way to learn surrogate models for such systems. A
simple and interpretable way to learn a dynamical system from data is to
interpolate its vector-field with a data-adapted kernel which can be learned by
using Kernel Flows. The method of Kernel Flows is a trainable machine learning
method that learns the optimal parameters of a kernel based on the premise that
a kernel is good if there is no significant loss in accuracy if half of the
data is used. (The objective function could be short-term prediction error or
some other objective, as in other variants of Kernel Flows.) However, this method is
limited by the choice of the base kernel. In this paper, we introduce the
method of \emph{Sparse Kernel Flows} in order to learn the ``best'' kernel by
starting from a large dictionary of kernels. It is based on sparsifying a
kernel that is a linear combination of elemental kernels. We apply this
approach to a library of 132 chaotic systems.
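To make the recipe concrete, here is a minimal sketch of the idea. It assumes a toy dictionary of five RBF kernels, uses the Lorenz system as a stand-in for the 132-system library, and optimizes with a simple proximal-gradient (ISTA) loop; the paper's actual kernel dictionary, loss variant, and training details are not reproduced here.

```python
# A minimal Sparse Kernel Flows sketch. Assumptions (not the paper's exact
# setup): five RBF bandwidths as the kernel dictionary, Euler integration of
# the Lorenz system for data, and an ISTA optimizer with finite differences.
import numpy as np

rng = np.random.default_rng(0)

# --- Trajectory data from the Lorenz system -------------------------------
def lorenz(x, sigma=10.0, r=28.0, beta=8.0 / 3.0):
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (r - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

dt, n = 0.01, 300
X = np.empty((n, 3)); X[0] = [1.0, 1.0, 1.0]
for i in range(n - 1):                        # simple Euler integration
    X[i + 1] = X[i] + dt * lorenz(X[i])
Y = (X[1:] - X[:-1]) / dt                     # finite-difference vector field
X = X[:-1]

# --- Dictionary of elemental kernels (RBFs at several bandwidths) ---------
gammas = np.array([1e-3, 1e-2, 1e-1, 1.0, 10.0])

def rbf(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def K_w(A, w):                                # K_w = sum_j w_j * k_j
    return sum(wj * rbf(A, A, g) for wj, g in zip(w, gammas))

# --- Kernel Flows loss: rho = 1 - ||u_half||^2 / ||u_full||^2 -------------
def rho(w, sub, reg=1e-8):
    def norm2(A, B):                          # tr(B^T K(A,A)^{-1} B)
        K = K_w(A, w) + reg * np.eye(len(A))
        return np.trace(B.T @ np.linalg.solve(K, B))
    return 1.0 - norm2(X[sub], Y[sub]) / norm2(X, Y)

# --- Proximal gradient: minimize rho(w) + lam * ||w||_1 over w >= 0 -------
def sparse_kernel_flows(w, lam=1e-3, lr=0.05, steps=100, eps=1e-4):
    for _ in range(steps):
        sub = rng.choice(len(X), len(X) // 2, replace=False)  # random half
        base, g = rho(w, sub), np.zeros_like(w)
        for j in range(len(w)):               # finite-difference gradient
            wp = w.copy(); wp[j] += eps
            g[j] = (rho(wp, sub) - base) / eps
        w = np.maximum(w - lr * g - lr * lam, 0.0)  # L1 prox, kept >= 0
    return w

w = sparse_kernel_flows(np.ones(len(gammas)))
print("learned kernel weights:", np.round(w, 4))  # several entries hit zero
```

The L1 proximal step drives the weights of uninformative elemental kernels exactly to zero, which is the sparsification the abstract refers to; a full implementation would use the paper's larger dictionary and automatic differentiation rather than finite-difference gradients.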
Related papers
- Kernel Sum of Squares for Data Adapted Kernel Learning of Dynamical Systems from Data: A global optimization approach [0.19999259391104385]
This paper examines the application of the Kernel Sum of Squares (KSOS) method for enhancing kernel learning from data.
Traditional kernel-based methods frequently struggle with selecting optimal base kernels and parameter tuning.
KSOS mitigates these issues by leveraging a global optimization framework with kernel-based surrogate functions.
arXiv Detail & Related papers (2024-08-12T19:32:28Z) - RFFNet: Large-Scale Interpretable Kernel Methods via Random Fourier Features [3.0079490585515347]
We introduce RFFNet, a scalable method that learns the kernel relevances on the fly via first-order optimization.
We show that our approach has a small memory footprint and run-time, low prediction error, and effectively identifies relevant features.
We supply users with an efficient, PyTorch-based library, that adheres to the scikit-learn standard API and code for fully reproducing our results.
arXiv Detail & Related papers (2022-11-11T18:50:34Z) - Learning "best" kernels from data in Gaussian process regression. With
application to aerodynamics [0.4588028371034406]
We introduce algorithms to select/design kernels in Gaussian process regression/kriging surrogate modeling techniques.
A first class of algorithms is kernel flow, which was introduced in the context of classification in machine learning.
A second class of algorithms is called spectral kernel ridge regression, and aims at selecting a "best" kernel such that the norm of the function to be approximated is minimal.
arXiv Detail & Related papers (2022-06-03T07:50:54Z) - Neural Fields as Learnable Kernels for 3D Reconstruction [101.54431372685018]
We present a novel method for reconstructing implicit 3D shapes based on a learned kernel ridge regression.
Our technique achieves state-of-the-art results when reconstructing 3D objects and large scenes from sparse oriented points.
arXiv Detail & Related papers (2021-11-26T18:59:04Z) - Learning dynamical systems from data: A simple cross-validation perspective, part III: Irregularly-Sampled Time Series [8.918419734720613]
A simple and interpretable way to learn a dynamical system from data is to interpolate its vector-field with a kernel.
Despite its previous successes, this strategy breaks down when the observed time series is not regularly sampled in time.
We propose to address this problem by directly approximating the vector field of the dynamical system, incorporating time differences into the (Kernel Flows) data-adapted kernels.
arXiv Detail & Related papers (2021-11-25T11:45:40Z) - Neural Networks as Kernel Learners: The Silent Alignment Effect [86.44610122423994]
Neural networks in the lazy training regime converge to kernel machines.
We show that this can indeed happen due to a phenomenon we term silent alignment.
We also demonstrate that non-whitened data can weaken the silent alignment effect.
arXiv Detail & Related papers (2021-10-29T18:22:46Z) - Understanding of Kernels in CNN Models by Suppressing Irrelevant Visual
Features in Images [55.60727570036073]
The lack of precise interpretation of kernels in convolutional neural networks (CNNs) is one of the main obstacles to the wide application of deep learning models in real scenarios.
A simple yet effective optimization method is proposed to interpret the activation of any kernel of interest in CNN models.
arXiv Detail & Related papers (2021-08-25T05:48:44Z) - Kernel Continual Learning [117.79080100313722]
Kernel continual learning is a simple but effective variant of continual learning that tackles catastrophic forgetting.
An episodic memory unit stores a subset of samples for each task to learn task-specific classifiers based on kernel ridge regression.
Variational random features are used to learn a data-driven kernel for each task.
arXiv Detail & Related papers (2021-07-12T22:09:30Z) - Taming Nonconvexity in Kernel Feature Selection---Favorable Properties
of the Laplace Kernel [77.73399781313893]
A challenge is to establish the objective function of kernel-based feature selection.
The gradient-based algorithms available for non-global optimization are only able to guarantee convergence to local minima.
arXiv Detail & Related papers (2021-06-17T11:05:48Z) - Kernel Identification Through Transformers [54.3795894579111]
Kernel selection plays a central role in determining the performance of Gaussian Process (GP) models.
This work addresses the challenge of constructing custom kernel functions for high-dimensional GP regression models.
We introduce a novel approach named KITT: Kernel Identification Through Transformers.
arXiv Detail & Related papers (2021-06-15T14:32:38Z) - Face Verification via learning the kernel matrix [9.414572104591027]
A kernel function is introduced to solve the nonlinear pattern recognition problem.
A promising approach is to learn the kernel from data automatically.
In this paper, nonlinear face verification via learning the kernel matrix is proposed.
arXiv Detail & Related papers (2020-01-21T03:39:09Z)