Physical Symbolic Optimization
- URL: http://arxiv.org/abs/2312.03612v1
- Date: Wed, 6 Dec 2023 16:56:28 GMT
- Title: Physical Symbolic Optimization
- Authors: Wassim Tenachi, Rodrigo Ibata, Foivos I. Diakogiannis
- Abstract summary: We present a framework for constraining the automatic sequential generation of equations to obey the rules of dimensional analysis by construction.
Our symbolic regression algorithm achieves state-of-the-art results in contexts in which variables and constants have known physical units.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a framework for constraining the automatic sequential generation
of equations to obey the rules of dimensional analysis by construction.
Combining this approach with reinforcement learning, we built $\Phi$-SO, a
Physical Symbolic Optimization method for recovering analytical functions from
physical data leveraging units constraints. Our symbolic regression algorithm
achieves state-of-the-art results in contexts in which variables and constants
have known physical units, outperforming all other methods on SRBench's Feynman
benchmark in the presence of noise (exceeding 0.1%) and showing resilience even
in the presence of significant (10%) levels of noise.
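As a rough illustration of the units bookkeeping such a constraint relies on (a minimal sketch, not the $\Phi$-SO implementation; the variable names and expression encoding are invented for the example), each quantity carries a vector of exponents over the base physical dimensions, and any candidate expression whose operations combine incompatible units can be rejected:

```python
# Minimal sketch of dimensional-analysis bookkeeping (illustrative, not the
# authors' code): units are exponent vectors over base dimensions, and an
# expression tree is only valid if every operation combines compatible units.
import numpy as np

# Exponents over (length, mass, time); all names below are illustrative.
UNITS = {
    "r": np.array([1, 0, 0]),   # a length
    "v": np.array([1, 0, -1]),  # a velocity
    "t": np.array([0, 0, 1]),   # a time
}

def units_of(node):
    """Return the unit vector of an expression tree, or None if inconsistent."""
    op, *args = node
    if op == "var":
        return UNITS[args[0]]
    if op in ("add", "sub"):
        u1, u2 = units_of(args[0]), units_of(args[1])
        if u1 is None or u2 is None or not np.array_equal(u1, u2):
            return None          # adding incompatible units is forbidden
        return u1
    if op == "mul":
        u1, u2 = units_of(args[0]), units_of(args[1])
        return None if u1 is None or u2 is None else u1 + u2
    if op == "div":
        u1, u2 = units_of(args[0]), units_of(args[1])
        return None if u1 is None or u2 is None else u1 - u2
    raise ValueError(f"unknown op {op}")

# v*t + r is dimensionally valid (both terms are lengths) ...
print(units_of(("add", ("mul", ("var", "v"), ("var", "t")), ("var", "r"))))
# ... whereas v + r is rejected.
print(units_of(("add", ("var", "v"), ("var", "r"))))
```

The paper enforces this kind of constraint during the sequential generation itself ("by construction"), rather than as the post-hoc filter sketched here.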
Related papers
- Class Symbolic Regression: Gotta Fit 'Em All [0.0]
We introduce 'Class Symbolic Regression' (Class SR), the first framework for automatically finding a single analytical functional form that accurately fits multiple datasets.
This hierarchical framework leverages the common constraint that all the members of a single class of physical phenomena follow a common governing law.
We introduce the first Class SR benchmark, comprising a series of synthetic physical challenges specifically designed to evaluate such algorithms.
arXiv Detail & Related papers (2023-12-04T11:45:44Z)
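A minimal sketch of the class-level fitting idea (illustrative only, not the Class SR algorithm; the exponential form, constants, and scipy-based fitting are assumptions for the example): a single shared analytic form is scored across several datasets, each with its own fitted constants.

```python
# Illustrative sketch: score one shared functional form against a *class* of
# datasets by fitting dataset-specific constants while keeping the form fixed.
import numpy as np
from scipy.optimize import curve_fit

def candidate(x, a, tau):           # one shared analytic form, free constants
    return a * np.exp(-x / tau)

rng = np.random.default_rng(0)
datasets = []                        # synthetic class members: same law, different constants
for a_true, tau_true in [(1.0, 2.0), (3.0, 0.5)]:
    x = np.linspace(0.1, 5, 50)
    y = a_true * np.exp(-x / tau_true) + 0.01 * rng.normal(size=x.size)
    datasets.append((x, y))

total_sse = 0.0
for x, y in datasets:
    popt, _ = curve_fit(candidate, x, y, p0=(1.0, 1.0))
    total_sse += np.sum((candidate(x, *popt) - y) ** 2)
print("class-level fit score:", total_sse)   # lower is better across all members
```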
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
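A workflow sketch of the build-once, reuse-many-times idea above (assumptions: a toy scalar "model Hamiltonian" response, a table-interpolation surrogate, and a generic scipy optimizer standing in for the paper's neural network and automatic differentiation):

```python
# Illustrative workflow: fit a surrogate once to simulated data, then reuse it
# to recover unknown model parameters from "experimental" measurements.
import numpy as np
from scipy.optimize import minimize

def simulator(q, J):                    # toy model response, illustrative only
    return J * np.sin(q) ** 2

# 1) Build the surrogate once from simulated data over a grid of parameters.
q = np.linspace(0, np.pi, 64)
J_grid = np.linspace(0.5, 2.0, 16)
table = np.array([simulator(q, J) for J in J_grid])   # precomputed responses

def surrogate(J):                       # cheap interpolation in parameter space
    return np.array([np.interp(J, J_grid, table[:, i]) for i in range(q.size)])

# 2) Recover the parameter from noisy "experimental" data by minimising a misfit.
J_true = 1.3
data = simulator(q, J_true) + 0.01 * np.random.default_rng(1).normal(size=q.size)
res = minimize(lambda p: np.sum((surrogate(p[0]) - data) ** 2), x0=[1.0])
print("recovered J:", res.x[0])
```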
- Deep symbolic regression for physics guided by units constraints: toward the automated discovery of physical laws [0.0]
Symbolic Regression is the study of algorithms that automate the search for analytic expressions that fit data.
We present $\Phi$-SO, a framework for recovering analytical symbolic expressions from physics data.
arXiv Detail & Related papers (2023-03-06T16:47:59Z)
- Auxiliary Functions as Koopman Observables: Data-Driven Analysis of Dynamical Systems via Polynomial Optimization [0.0]
We present a flexible data-driven method for system analysis that does not require explicit model discovery.
The method is rooted in well-established techniques for approximating the Koopman operator from data and is implemented as a semidefinite program that can be solved numerically.
arXiv Detail & Related papers (2023-03-02T18:44:18Z)
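For context, a textbook way of approximating the Koopman operator from data is an EDMD-style least-squares fit over a dictionary of observables; the sketch below shows that baseline on a toy map (the paper itself works with auxiliary functions and a semidefinite program rather than this least-squares form, and the map and dictionary here are invented for illustration).

```python
# Minimal EDMD-style sketch of approximating the Koopman operator from
# trajectory data (illustrative baseline, not the paper's SDP formulation).
import numpy as np

def step(x):                                   # toy nonlinear map, illustrative
    return np.array([0.9 * x[0], 0.8 * x[1] + 0.1 * x[0] ** 2])

def observables(x):                            # dictionary: 1, x1, x2, x1^2
    return np.array([1.0, x[0], x[1], x[0] ** 2])

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))          # sampled states
Y = np.array([step(x) for x in X])             # their one-step images

Psi_X = np.array([observables(x) for x in X])  # snapshots lifted to observables
Psi_Y = np.array([observables(y) for y in Y])

# Least-squares Koopman approximation K with Psi_Y ≈ Psi_X @ K
K, *_ = np.linalg.lstsq(Psi_X, Psi_Y, rcond=None)
print("approximate Koopman eigenvalues:", np.linalg.eigvals(K))
```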
- Guaranteed Conservation of Momentum for Learning Particle-based Fluid Dynamics [96.9177297872723]
We present a novel method for guaranteeing conservation of linear momentum in learned physics simulations.
We enforce conservation of momentum with a hard constraint, which we realize via antisymmetrical continuous convolutional layers.
In combination, the proposed method allows us to increase the physical accuracy of the learned simulator substantially.
arXiv Detail & Related papers (2022-10-12T09:12:59Z)
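The reason antisymmetric pairwise terms conserve linear momentum is that every contribution f_ij is cancelled by f_ji = -f_ij when summed over particles. A minimal numpy sketch of that cancellation (not the paper's continuous convolutional layers; the tanh interaction is an arbitrary odd function chosen for illustration):

```python
# Sketch of why antisymmetric pairwise interactions conserve linear momentum.
import numpy as np

rng = np.random.default_rng(0)
pos = rng.normal(size=(5, 2))                   # particle positions
vel = rng.normal(size=(5, 2))                   # particle velocities (equal masses)

def pairwise_update(pos):
    """Antisymmetric interactions: f_ij = -f_ji, so the sum over all pairs is zero."""
    n = len(pos)
    dv = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = pos[i] - pos[j]
            f = np.tanh(r)                      # any odd function of r flips sign when i and j swap
            dv[i] += f
    return dv

vel_new = vel + 0.01 * pairwise_update(pos)
print("momentum before:", vel.sum(axis=0))
print("momentum after: ", vel_new.sum(axis=0))  # identical up to floating-point error
```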
- Label noise (stochastic) gradient descent implicitly solves the Lasso for quadratic parametrisation [14.244787327283335]
We study the role of label noise in the training dynamics of a quadratically parametrised model through its continuous-time version.
Our findings highlight the fact that structured noise can induce better generalisation, and they help explain the superior performance of stochastic dynamics observed in practice.
arXiv Detail & Related papers (2022-06-20T15:24:42Z)
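A hedged sketch of the setting suggested by the title, using the quadratic parametrisation that is standard in this line of work (the paper's exact parametrisation, noise model, and assumptions may differ):

```latex
% Hedged sketch; see the paper for the precise statement and assumptions.
% The regression vector is written through a quadratic (over-)parametrisation
%   \beta_\theta = u \odot u - v \odot v,  with  \theta = (u, v),
% and (stochastic) gradient descent is run on the label-noise-perturbed loss:
\[
  L(\theta) \;=\; \frac{1}{2n}\sum_{i=1}^{n}
  \bigl(y_i + \varepsilon_i - x_i^{\top}\beta_\theta\bigr)^{2},
  \qquad \varepsilon_i \ \text{i.i.d. label noise.}
\]
% The claim of the title is that this dynamics implicitly biases \beta_\theta
% toward solutions of an \ell_1-penalised (Lasso-type) problem,
%   \min_\beta \ \tfrac{1}{2n}\|y - X\beta\|_2^2 + \lambda \|\beta\|_1,
% with an effective regularisation strength depending on the noise level and
% step size (precise form given in the paper).
```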
- A Priori Denoising Strategies for Sparse Identification of Nonlinear Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
arXiv Detail & Related papers (2022-01-29T23:31:25Z)
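A toy comparison in the spirit of the study above (illustrative only; the signal, noise level, and the particular local/global smoothers are assumptions, not the paper's benchmark):

```python
# Illustrative local-vs-global a-priori denoising of a noisy state measurement
# before system identification.
import numpy as np
from scipy.signal import savgol_filter          # local: sliding polynomial window
from scipy.interpolate import UnivariateSpline  # global: uses the whole record

t = np.linspace(0, 10, 500)
clean = np.sin(t) * np.exp(-0.1 * t)
noisy = clean + 0.05 * np.random.default_rng(0).normal(size=t.size)

local_est = savgol_filter(noisy, window_length=31, polyorder=3)
global_est = UnivariateSpline(t, noisy, s=noisy.size * 0.05 ** 2)(t)

for name, est in [("local (Savitzky-Golay)", local_est), ("global (spline)", global_est)]:
    print(name, "RMSE:", np.sqrt(np.mean((est - clean) ** 2)))
```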
- Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded based on the 'complexity' of the fractal structure that underlies its invariant measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the stochasticity in its success remains unclear.
We show that multiplicative noise, which commonly arises due to variance in minibatch training, leads to heavy-tailed stationary behaviour in the parameters.
A detailed analysis describes how key factors, including step size and the data, shape this behaviour, and experiments on state-of-the-art neural network models exhibit similar heavy-tailed behaviour.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
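The core mechanism can be illustrated with a multiplicative random recurrence: even with light-tailed noise, iterates of x_{k+1} = a_k x_k + b_k develop heavy-tailed stationary behaviour when the multiplier a_k fluctuates enough. A minimal simulation sketch (parameters chosen for illustration, not taken from the paper):

```python
# Sketch of the basic heavy-tail mechanism via a multiplicative random
# recurrence (illustrative, not the paper's analysis).
import numpy as np

rng = np.random.default_rng(0)
n_chains, n_steps = 20000, 2000
x = np.zeros(n_chains)
for _ in range(n_steps):
    a = 1.0 + 0.5 * rng.normal(size=n_chains)   # multiplicative noise around 1
    b = rng.normal(size=n_chains)               # light-tailed additive noise
    x = a * x + b

# Heavy tails show up as extreme upper quantiles far beyond the bulk.
for q in [0.9, 0.99, 0.999]:
    print(f"{q:.3f} quantile of |x|:", np.quantile(np.abs(x), q))
```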
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all summaries) and is not responsible for any consequences of its use.