Symbolic Regression on Sparse and Noisy Data with Gaussian Processes
- URL: http://arxiv.org/abs/2309.11076v2
- Date: Thu, 28 Mar 2024 01:00:05 GMT
- Title: Symbolic Regression on Sparse and Noisy Data with Gaussian Processes
- Authors: Junette Hsin, Shubhankar Agarwal, Adam Thorpe, Luis Sentis, David Fridovich-Keil
- Abstract summary: We combine Gaussian process regression with a sparse identification of nonlinear dynamics (SINDy) method to denoise the data and identify nonlinear dynamical equations.
Our simple approach offers improved robustness with sparse, noisy data compared to SINDy alone.
We show superior performance over baselines including 20.78% improvement over SINDy and 61.92% improvement over SSR in predicting future trajectories.
- Score: 11.413977318301903
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we address the challenge of deriving dynamical models from sparse and noisy data. High-quality data is crucial for symbolic regression algorithms; limited and noisy data can present modeling challenges. To overcome this, we combine Gaussian process regression with a sparse identification of nonlinear dynamics (SINDy) method to denoise the data and identify nonlinear dynamical equations. Our simple approach offers improved robustness with sparse, noisy data compared to SINDy alone. We demonstrate its effectiveness on a Lotka-Volterra model, a unicycle dynamic model in simulation, and hardware data from an NVIDIA JetRacer system. We show superior performance over baselines including 20.78% improvement over SINDy and 61.92% improvement over SSR in predicting future trajectories from discovered dynamics.
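The pipeline described in the abstract, denoise with a Gaussian process, then run sparse regression on the smoothed states, can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the RBF kernel, the fixed hyperparameters, and the sequentially thresholded least squares (STLSQ) solver below are common stand-ins for whatever choices the paper actually makes.

```python
import numpy as np

def gp_denoise(t, y, t_eval, length_scale=1.0, noise_std=0.1):
    """GP regression with an RBF kernel: posterior mean at t_eval
    given noisy scalar observations y at sample times t."""
    K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / length_scale) ** 2)
    K_s = np.exp(-0.5 * ((t_eval[:, None] - t[None, :]) / length_scale) ** 2)
    alpha = np.linalg.solve(K + noise_std**2 * np.eye(len(t)), y)
    return K_s @ alpha

def stlsq(Theta, dxdt, threshold=0.1, iters=10):
    """Sequentially thresholded least squares, the core SINDy step:
    find sparse coefficients xi with Theta @ xi ~ dxdt."""
    xi = np.linalg.lstsq(Theta, dxdt, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0                      # prune small coefficients
        for k in range(xi.shape[1]):
            big = ~small[:, k]
            if big.any():                    # refit the surviving terms
                xi[big, k] = np.linalg.lstsq(
                    Theta[:, big], dxdt[:, k], rcond=None)[0]
    return xi
```

A typical use: call `gp_denoise` on each state channel, differentiate the smoothed states numerically, build a candidate library `Theta` (e.g. polynomial terms in the states), and hand both to `stlsq` to recover a sparse dynamics model.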
Related papers
- How more data can hurt: Instability and regularization in next-generation reservoir computing [0.0]
We show that a more extreme version of the phenomenon occurs in data-driven models of dynamical systems.
We find that, despite learning a better representation of the flow map with more training data, NGRC can adopt an "ill-conditioned integrator" and lose stability.
arXiv Detail & Related papers (2024-07-11T16:22:13Z)
- Bridging the Sim-to-Real Gap with Bayesian Inference [53.61496586090384]
We present SIM-FSVGD for learning robot dynamics from data.
We use low-fidelity physical priors to regularize the training of neural network models.
We demonstrate the effectiveness of SIM-FSVGD in bridging the sim-to-real gap on a high-performance RC racecar system.
arXiv Detail & Related papers (2024-03-25T11:29:32Z)
- A Multi-step Loss Function for Robust Learning of the Dynamics in Model-based Reinforcement Learning [10.940666275830052]
In model-based reinforcement learning, most algorithms rely on simulating trajectories from one-step models of the dynamics learned on data.
We tackle this issue by using a multi-step objective to train one-step models.
We find that this new loss is particularly useful when the data is noisy, which is often the case in real-life environments.
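The idea of a multi-step objective, train a one-step model but score it on compounded rollouts, can be sketched as below. This is a hedged illustration only: the linear model `x_{t+1} = A @ x_t` and the fixed horizon are stand-in assumptions, not the paper's actual model class or loss.

```python
import numpy as np

def multi_step_loss(A, traj, horizon=5):
    """Score a one-step linear model x_{t+1} = A @ x_t by rolling it
    forward `horizon` steps from every start state and accumulating
    the squared error against the observed trajectory."""
    loss = 0.0
    for t0 in range(len(traj) - horizon):
        x = traj[t0]
        for h in range(1, horizon + 1):
            x = A @ x                        # compound the model's own predictions
            loss += np.sum((x - traj[t0 + h]) ** 2)
    return loss
```

Because errors compound through the rollout, minimizing this loss penalizes model mismatch that a pure one-step objective would barely see, which is why it helps on noisy data.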
arXiv Detail & Related papers (2024-02-05T16:13:00Z)
- Conditional Denoising Diffusion for Sequential Recommendation [62.127862728308045]
Two prominent generative models have been applied to this task: Generative Adversarial Networks (GANs) and Variational AutoEncoders (VAEs).
GANs suffer from unstable optimization, while VAEs are prone to posterior collapse and over-smoothed generations.
We present a conditional denoising diffusion model, which includes a sequence encoder, a cross-attentive denoising decoder, and a step-wise diffuser.
arXiv Detail & Related papers (2023-04-22T15:32:59Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Learning Sparse Nonlinear Dynamics via Mixed-Integer Optimization [3.7565501074323224]
We propose an exact formulation of the SINDy problem using mixed-integer optimization (MIO) to solve the sparsity-constrained regression problem to provable optimality in seconds.
We illustrate the dramatic improvement of our approach in accurate model discovery while being more sample efficient, robust to noise, and flexible in accommodating physical constraints.
arXiv Detail & Related papers (2022-06-01T01:43:45Z)
- A Priori Denoising Strategies for Sparse Identification of Nonlinear Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
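The local-versus-global distinction can be made concrete with two simple denoisers. These are illustrative stand-ins, not the specific techniques compared in the paper: a centered moving average represents the local family (a neighboring data subset around each point), and one polynomial least-squares fit over the whole series represents the global family.

```python
import numpy as np

def local_smooth(y, window=5):
    """Local denoising: centered moving average over a neighboring
    subset; edges are averaged over the points actually available."""
    kernel = np.ones(window)
    return (np.convolve(y, kernel, mode="same")
            / np.convolve(np.ones_like(y), kernel, mode="same"))

def global_smooth(t, y, degree=7):
    """Global denoising: a single least-squares polynomial fit that
    uses the entire measurement data set at once."""
    return np.polyval(np.polyfit(t, y, degree), t)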
arXiv Detail & Related papers (2022-01-29T23:31:25Z)
- 'Next Generation' Reservoir Computing: an Empirical Data-Driven Expression of Dynamical Equations in Time-Stepping Form [0.0]
Next generation reservoir computing based on nonlinear vector autoregression is applied to emulate simple dynamical system models.
It is also shown that the approach can be extended to produce high-order numerical schemes directly from data.
The impacts of noise and temporal sparsity in the training set are examined to gauge the potential use of this method for more realistic applications.
arXiv Detail & Related papers (2022-01-13T20:13:33Z)
- Learning Unstable Dynamics with One Minute of Data: A Differentiation-based Gaussian Process Approach [47.045588297201434]
We show how to exploit the differentiability of Gaussian processes to create a state-dependent linearized approximation of the true continuous dynamics.
We validate our approach by iteratively learning the system dynamics of an unstable system such as a 9-D segway.
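Exploiting GP differentiability can be sketched for a single scalar dynamics component: the posterior mean of an RBF-kernel GP is differentiable in closed form, and its gradient at a query state gives a state-dependent linearization. This is an assumed minimal construction, not the paper's method; the kernel, hyperparameters, and scalar-output restriction are illustrative choices.

```python
import numpy as np

def gp_posterior_and_grad(X, y, x_star, length_scale=1.0, noise_std=0.1):
    """RBF-GP posterior mean at x_star and its gradient w.r.t. x_star.
    The gradient is one row of a state-dependent Jacobian of the
    learned dynamics."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-0.5 * sq / length_scale**2)
    alpha = np.linalg.solve(K + noise_std**2 * np.eye(len(X)), y)
    k_star = np.exp(-0.5 * np.sum((X - x_star) ** 2, axis=1)
                    / length_scale**2)
    mean = k_star @ alpha
    # d/dx* of exp(-|x* - x_i|^2 / (2 l^2)) = -(x* - x_i)/l^2 * k_i
    dk = -(x_star - X) / length_scale**2 * k_star[:, None]
    grad = dk.T @ alpha
    return mean, grad
```

Stacking one such gradient per state dimension yields the linearized model A(x*) used for local control or stability analysis.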
arXiv Detail & Related papers (2021-03-08T05:08:47Z)
- Learning Stable Nonparametric Dynamical Systems with Gaussian Process Regression [9.126353101382607]
We learn a nonparametric Lyapunov function based on Gaussian process regression from data.
We prove that stabilization of the nominal model based on the nonparametric control Lyapunov function does not modify the behavior of the nominal model at training samples.
arXiv Detail & Related papers (2020-06-14T11:17:17Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance across a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.