Differentiable Programming of Chemical Reaction Networks
- URL: http://arxiv.org/abs/2302.02714v1
- Date: Mon, 6 Feb 2023 11:41:14 GMT
- Title: Differentiable Programming of Chemical Reaction Networks
- Authors: Alexander Mordvintsev, Ettore Randazzo, Eyvind Niklasson
- Score: 63.948465205530916
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We present a differentiable formulation of abstract chemical reaction
networks (CRNs) that can be trained to solve a variety of computational tasks.
Chemical reaction networks are one of the most fundamental computational
substrates used by nature. We study well-mixed single-chamber systems, as well
as systems with multiple chambers separated by membranes, under mass-action
kinetics. We demonstrate that differentiable optimisation, combined with proper
regularisation, can discover non-trivial sparse reaction networks that can
implement various sorts of oscillators and other chemical computing devices.
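Under mass-action kinetics, each reaction fires at a rate proportional to the product of its reactant concentrations, and the rate constants are the natural trainable parameters. A minimal sketch of this setting (the two-reaction system and rate constants below are invented for illustration, not taken from the paper):

```python
import numpy as np

# Minimal mass-action CRN sketch. Species: A, B, C.
# Reaction 1: A + B -> C  (rate k1);  Reaction 2: C -> A + B  (rate k2).
R = np.array([[1.0, 1.0, 0.0],       # reactant stoichiometry (reactions x species)
              [0.0, 0.0, 1.0]])
P = np.array([[0.0, 0.0, 1.0],       # product stoichiometry
              [1.0, 1.0, 0.0]])
k = np.array([2.0, 1.0])             # rate constants: the trainable parameters
                                     # in a differentiable-CRN setting

def dcdt(c):
    rates = k * np.prod(c ** R, axis=1)  # mass action: rate_i = k_i * prod_s c_s^R[i,s]
    return (P - R).T @ rates             # net concentration change per species

c = np.array([1.0, 1.0, 0.0])            # initial concentrations of A, B, C
dt = 0.01
for _ in range(2000):                    # forward Euler; swapping in a
    c = c + dt * dcdt(c)                 # differentiable solver (e.g. JAX) would
                                         # let gradients flow back into k

print(np.round(c, 3))   # settles at the equilibrium where k1*[A][B] = k2*[C]
```

Because every step of the simulation is a smooth function of `k`, a loss defined on the final (or time-course) concentrations can be backpropagated to the rate constants, which is the basic mechanism the paper's differentiable optimisation relies on.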
Related papers
- Autonomous Learning of Generative Models with Chemical Reaction Network Ensembles [0.0]
We develop a general architecture whereby a broad class of chemical systems can autonomously learn complex distributions.
Our construction takes the form of a chemical implementation of machine learning's optimization workhorse: gradient descent on the relative entropy cost function.
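For intuition, gradient descent on a relative entropy (KL divergence) cost can be sketched numerically; this is only an ordinary numerical analogue of the objective, not the paper's chemical construction, and the target distribution below is made up:

```python
import numpy as np

# Gradient descent on KL(target || q) with q = softmax(theta).
target = np.array([0.2, 0.3, 0.5])   # distribution to be learned (invented)
theta = np.zeros(3)                  # logits of the learned distribution

def softmax(t):
    e = np.exp(t - t.max())
    return e / e.sum()

for _ in range(500):
    q = softmax(theta)
    theta -= 0.1 * (q - target)      # d KL(target || q) / d theta = q - target

q = softmax(theta)
print(np.round(q, 3))                # converges to the target distribution
```

The gradient of the relative entropy with respect to the logits reduces to the simple difference `q - target`, which is part of what makes this cost function attractive for physical implementations.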
arXiv Detail & Related papers (2023-11-02T03:46:23Z) - Probing reaction channels via reinforcement learning [4.523974776690403]
We propose a reinforcement learning based method to identify important configurations that connect reactant and product states along chemical reaction paths.
By shooting multiple trajectories from these configurations, we can generate an ensemble of configurations that concentrate on the transition path ensemble.
The resulting solution, known as the committor function, encodes mechanistic information for the reaction and can in turn be used to evaluate reaction rates.
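What the committor measures can be seen in the simplest possible system, an unbiased 1D random walk between an absorbing "reactant" state 0 and "product" state N; this toy estimate is only illustrative and has nothing to do with the paper's reinforcement-learning method. For this walk the exact committor is q(x) = x / N:

```python
import numpy as np

N = 10
rng = np.random.default_rng(0)

def committor_mc(x, trials=10_000):
    """Monte Carlo estimate of P(reach N before 0 | start at x)."""
    hits = 0
    for _ in range(trials):
        s = x
        while 0 < s < N:
            s += 2 * rng.integers(0, 2) - 1   # unbiased +/-1 step
        hits += (s == N)
    return hits / trials

print(committor_mc(3))   # close to the exact value 3/10 = 0.3
```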
arXiv Detail & Related papers (2023-05-27T17:22:32Z) - Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z) - Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z) - Neural-network solutions to stochastic reaction networks [7.021105583098606]
We propose a machine-learning approach using a variational autoregressive network to solve the chemical master equation.
The proposed approach tracks the time evolution of the joint probability distribution in the state space of species counts.
We demonstrate that it accurately generates the probability distribution over time in the genetic toggle switch and the early life self-replicator.
arXiv Detail & Related papers (2022-09-29T07:27:59Z) - Modeling Diverse Chemical Reactions for Single-step Retrosynthesis via Discrete Latent Variables [43.900173434781905]
The goal of single-step retrosynthesis is to identify the possible reactants that lead to the synthesis of the target product in one reaction.
Existing sequence-based retrosynthetic methods treat product-to-reactant retrosynthesis as a sequence-to-sequence translation problem.
We propose RetroDVCAE, which incorporates conditional variational autoencoders into single-step retrosynthesis and associates discrete latent variables with the generation process.
arXiv Detail & Related papers (2022-08-10T14:50:32Z) - A Grid-Structured Model of Tubular Reactors [61.38002492702646]
The proposed model may be entirely based on the known form of the partial differential equations or it may contain generic machine learning components such as multi-layer perceptrons.
We show that the proposed model can be trained using limited amounts of data to describe the state of a fixed-bed catalytic reactor.
arXiv Detail & Related papers (2021-12-13T19:54:23Z) - Programming and Training Rate-Independent Chemical Reaction Networks [9.001036626196258]
Natural biochemical systems are typically modeled by chemical reaction networks (CRNs).
CRNs can be used as a specification language for synthetic chemical computation.
We show a technique to program NC-CRNs using well-founded deep learning methods.
arXiv Detail & Related papers (2021-09-20T15:31:03Z) - Kinetics-Informed Neural Networks [0.0]
We use feed-forward artificial neural networks as basis functions for the construction of surrogate models to solve ordinary differential equations.
We show that the simultaneous training of neural nets and kinetic model parameters in a regularized multiobjective optimization setting leads to the solution of the inverse problem.
This surrogate approach to inverse kinetic ODEs can assist in the elucidation of reaction mechanisms based on transient data.
arXiv Detail & Related papers (2020-11-30T00:07:09Z) - Exact representations of many body interactions with RBM neural networks [77.34726150561087]
We exploit the representation power of RBMs to provide an exact decomposition of many-body contact interactions into one-body operators.
This construction generalizes the well-known Hirsch transform used for the Hubbard model to more complicated theories such as Pionless EFT in nuclear physics.
arXiv Detail & Related papers (2020-05-07T15:59:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences of its use.