Algebraic Dynamical Systems in Machine Learning
- URL: http://arxiv.org/abs/2311.03118v1
- Date: Mon, 6 Nov 2023 14:10:40 GMT
- Title: Algebraic Dynamical Systems in Machine Learning
- Authors: Iolo Jones, Jerry Swan, and Jeffrey Giansiracusa
- Abstract summary: We show that a recursive function applied to the output of an iterated rewriting system defines a formal class of models.
We also show that these algebraic models are a natural language for describing the compositionality of dynamic models.
These models provide a template for the generalisation of the above dynamic models to learning problems on structured or non-numerical data.
- Score: 0.1843404256219181
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce an algebraic analogue of dynamical systems, based on term
rewriting. We show that a recursive function applied to the output of an
iterated rewriting system defines a formal class of models into which all the
main architectures for dynamic machine learning models (including recurrent
neural networks, graph neural networks, and diffusion models) can be embedded.
Considered in category theory, we also show that these algebraic models are a
natural language for describing the compositionality of dynamic models.
Furthermore, we propose that these models provide a template for the
generalisation of the above dynamic models to learning problems on structured
or non-numerical data, including 'hybrid symbolic-numeric' models.
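For a concrete (and purely illustrative) reading of this construction, the sketch below is a minimal Python toy, not code from the paper: it iterates a single term rewriting rule over nested tuples and then applies a recursive readout function to the rewritten term. The term encoding, the rule `x + 0 -> x`, and the readout are assumptions chosen only to show the shape of "iterated rewriting followed by a recursive function".

```python
# Minimal, illustrative sketch (not from the paper): an "algebraic dynamical
# system" as an iterated term rewriting step plus a recursive readout.
# Terms are nested tuples ('op', arg1, arg2, ...); leaves are numbers.

def rewrite_once(term):
    """One pass of a toy rewrite rule, x + 0 -> x, applied bottom-up.
    The rule set here is an assumption purely for illustration."""
    if not isinstance(term, tuple):
        return term
    op, *args = term
    args = [rewrite_once(a) for a in args]          # rewrite subterms first
    if op == "+" and len(args) == 2 and args[1] == 0:
        return args[0]                              # x + 0  ->  x
    return (op, *args)

def iterate_rewriting(term, steps):
    """The 'dynamical system': apply the rewriting step a fixed number of times."""
    for _ in range(steps):
        term = rewrite_once(term)
    return term

def readout(term):
    """A recursive function on terms, playing the role of the model's output map."""
    if not isinstance(term, tuple):
        return float(term)
    op, *args = term
    vals = [readout(a) for a in args]
    return sum(vals) if op == "+" else vals[0] * vals[1]

if __name__ == "__main__":
    t = ("+", ("*", 2, 3), ("+", 4, 0))             # (2 * 3) + (4 + 0)
    print(readout(iterate_rewriting(t, steps=3)))   # -> 10.0
```

One way to read the embedding claim informally: for a recurrent neural network, the "term" is the numeric hidden state and the rewriting step is the learned state-update map, with the readout playing the role of the output layer. This is a gloss on the abstract, not the paper's formal construction.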
Related papers
- Deep Koopman-layered Model with Universal Property Based on Toeplitz Matrices [26.96258010698567]
The proposed model combines theoretical soundness with flexibility.
This flexibility enables the model to fit time-series data from nonautonomous dynamical systems.
arXiv Detail & Related papers (2024-10-03T04:27:46Z)
- Learnable & Interpretable Model Combination in Dynamic Systems Modeling [0.0]
We discuss which types of models are usually combined and propose a model interface capable of expressing a variety of mixed equation-based models.
We propose a new wildcard topology that can describe the generic connection between two combined models in an easy-to-interpret fashion.
The contributions of this paper are highlighted in a proof of concept: different connection topologies between two models are learned, interpreted, and compared.
arXiv Detail & Related papers (2024-06-12T11:17:11Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which lie beyond the state-of-the-art applications of SR to physical problems.
We demonstrate the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Discrete, compositional, and symbolic representations through attractor dynamics [51.20712945239422]
We introduce a novel neural systems model that integrates attractor dynamics with symbolic representations to model cognitive processes akin to the probabilistic language of thought (PLoT).
Our model segments the continuous representational space into discrete basins, with attractor states corresponding to symbolic sequences; through unsupervised learning rather than pre-defined primitives, these states reflect the semanticity and compositionality characteristic of symbolic systems.
This approach establishes a unified framework that integrates both symbolic and sub-symbolic processing through neural dynamics, a neuroplausible substrate with proven expressivity in AI, offering a more comprehensive model that mirrors the complex duality of cognitive operations.
arXiv Detail & Related papers (2023-10-03T05:40:56Z)
- Physical Modeling using Recurrent Neural Networks with Fast Convolutional Layers [1.7013938542585922]
We describe several novel recurrent neural network structures and show how they can be thought of as an extension of modal techniques.
As a proof of concept, we generate synthetic data for three physical systems and show that the proposed network structures can be trained with this data to reproduce the behavior of these systems.
arXiv Detail & Related papers (2022-04-21T14:22:44Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z)
- Model-agnostic multi-objective approach for the evolutionary discovery of mathematical models [55.41644538483948]
In modern data science, it is often more important to understand the properties of a model and to identify which parts could be replaced to obtain better results.
We use multi-objective evolutionary optimization for composite data-driven model learning to obtain the desired properties of the algorithm.
arXiv Detail & Related papers (2021-07-07T11:17:09Z)
- Artificial neural network as a universal model of nonlinear dynamical systems [0.0]
The map is built as an artificial neural network whose weights encode a modeled system.
We consider the Lorenz system, the Roessler system, and also the Hindmarsh-Rose neuron.
High similarity is observed for visual images of attractors, power spectra, bifurcation diagrams, and Lyapunov exponents. (A generic sketch of this next-state-map idea, using the Lorenz system, appears after this list.)
arXiv Detail & Related papers (2021-03-06T16:02:41Z)
- S2RMs: Spatially Structured Recurrent Modules [105.0377129434636]
We take a step towards dynamic models that are capable of simultaneously exploiting both modular and spatiotemporal structures.
We find our models to be robust to the number of available views and better able to generalize to novel tasks without additional training.
arXiv Detail & Related papers (2020-07-13T17:44:30Z)
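Several of the entries above share one generic idea: simulate or observe a dynamical system, then fit a neural network that plays the role of its approximate evolution map. The sketch below is a minimal, hypothetical illustration of that idea only, not code from any of the listed papers; the forward-Euler integrator, network size, and use of scikit-learn's MLPRegressor are assumptions made for brevity.

```python
# Generic illustration (not from any listed paper): learn a one-step map
# x_{t+1} ~ f(x_t) of the Lorenz system from simulated trajectory data.
import numpy as np
from sklearn.neural_network import MLPRegressor  # assumes scikit-learn is available

def lorenz_step(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations (small dt for stability)."""
    dx = sigma * (x[1] - x[0])
    dy = x[0] * (rho - x[2]) - x[1]
    dz = x[0] * x[1] - beta * x[2]
    return x + dt * np.array([dx, dy, dz])

# Simulate a trajectory to use as training data.
x = np.array([1.0, 1.0, 1.0])
traj = [x]
for _ in range(5000):
    x = lorenz_step(x)
    traj.append(x)
traj = np.array(traj)

# Supervised pairs: current state -> next state.
X, Y = traj[:-1], traj[1:]

# A small MLP whose weights encode the (approximate) dynamics.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X, Y)

# Roll the learned map forward from a state taken from the trajectory.
x_hat = traj[2500].copy()
for _ in range(10):
    x_hat = model.predict(x_hat.reshape(1, -1))[0]
print("learned 10-step prediction:", x_hat)
```

The cited works use more careful integrators, architectures, and evaluation criteria (for example, comparing attractors, power spectra, bifurcation diagrams, and Lyapunov exponents) than this toy does.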
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.