Autonomous Kinetic Modeling of Biomass Pyrolysis using Chemical Reaction
Neural Networks
- URL: http://arxiv.org/abs/2105.11397v1
- Date: Mon, 24 May 2021 16:38:40 GMT
- Title: Autonomous Kinetic Modeling of Biomass Pyrolysis using Chemical Reaction
Neural Networks
- Authors: Weiqi Ji, Franz Richter, Michael J. Gollner, Sili Deng
- Abstract summary: Modeling the burning processes of biomass such as wood, grass, and crops is crucial for the modeling and prediction of wildland and urban fire behavior.
This work presents a framework for autonomously discovering biomass pyrolysis kinetic models from thermogravimetric analyzer (TGA) experimental data using the recently developed chemical reaction neural networks (CRNN).
CRNN is fully interpretable because it incorporates fundamental physical laws, such as the law of mass action and the Arrhenius law, into the neural network structure.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Modeling the burning processes of biomass such as wood, grass, and crops is
crucial for the modeling and prediction of wildland and urban fire behavior.
Despite its importance, the burning of solid fuels remains poorly understood,
which can be partly attributed to the unknown chemical kinetics of most solid
fuels. Most available kinetic models were built upon expert knowledge, which
requires chemical insights and years of experience. This work presents a
framework for autonomously discovering biomass pyrolysis kinetic models from
thermogravimetric analyzer (TGA) experimental data using the recently developed
chemical reaction neural networks (CRNN). The approach incorporated the CRNN
model into the framework of neural ordinary differential equations to predict
the residual mass in TGA data. In addition to the flexibility of
neural-network-based models, the learned CRNN model is fully interpretable because it
incorporates fundamental physical laws, such as the law of mass action and the
Arrhenius law, into the neural network structure. The learned CRNN model can
then be translated into the classical forms of biomass chemical kinetic models,
which facilitates the extraction of chemical insights and the integration of
the kinetic model into large-scale fire simulations. We demonstrated the
effectiveness of the framework in predicting the pyrolysis and oxidation of
cellulose. This successful demonstration opens the possibility of rapid and
autonomous chemical kinetic modeling of solid fuels, such as wildfire fuels and
industrial polymers.
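
The core computation described above can be illustrated with a minimal, self-contained sketch (Python with NumPy/SciPy). All parameter values below are made up for a toy one-step pyrolysis reaction and are not the fitted values from the paper; in the actual framework the input weights (reaction orders), output weights (stoichiometric coefficients), ln A, and Ea are trainable parameters optimized by back-propagating through the ODE solve in a neural-ODE setting.

```python
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314  # gas constant, J/(mol K)

# Toy parameters for a single reaction neuron: solid -> char (+ volatiles).
# Input weights encode reaction orders (law of mass action), ln(A) and Ea the
# Arrhenius law, and output weights the mass-based stoichiometry.
w_in = np.array([[1.0, 0.0]])        # reaction order with respect to each species
w_out = np.array([[-1.0], [0.35]])   # 1 kg solid -> 0.35 kg char; volatiles leave the sample
ln_A = np.array([18.0])              # log pre-exponential factor (1/s)
Ea = np.array([1.5e5])               # activation energy, J/mol
beta = 10.0 / 60.0                   # TGA heating rate, K/s (10 K/min)
T0 = 300.0                           # initial temperature, K

def crnn_rhs(t, y):
    """One CRNN layer as an ODE right-hand side: dY/dt = W_out exp(W_in ln Y + ln A - Ea/(R T))."""
    T = T0 + beta * t                            # linear TGA temperature program
    ln_y = np.log(np.clip(y, 1e-12, None))       # guard against log(0)
    ln_rate = w_in @ ln_y + ln_A - Ea / (R * T)  # mass action + Arrhenius, in log space
    return (w_out @ np.exp(ln_rate)).ravel()

sol = solve_ivp(crnn_rhs, (0.0, 3600.0), [1.0, 0.0], method="BDF", rtol=1e-8)
residual_mass = sol.y.sum(axis=0)                # curve to compare against the measured TGA mass
```

Because every weight has a physical meaning, a trained CRNN of this form can be read off directly as a conventional Arrhenius mechanism, which is what allows the learned model to be integrated into large-scale fire simulations.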
Related papers
- ChemKANs for Combustion Chemistry Modeling and Acceleration [0.0]
Machine learning techniques have been proposed to streamline chemical kinetic model inference.
ChemKAN can accurately represent hydrogen combustion chemistry, providing a 2x acceleration over the detailed chemistry in a solver.
These demonstrations indicate potential for ChemKANs in combustion physics and chemical kinetics.
arXiv Detail & Related papers (2025-04-17T01:53:28Z)
- Learning Chemical Reaction Representation with Reactant-Product Alignment [50.28123475356234]
This paper introduces a novel chemical reaction representation learning model tailored for a variety of organic-reaction-related tasks.
By integrating atomic correspondence between reactants and products, our model discerns the molecular transformations that occur during the reaction, thereby enhancing the comprehension of the reaction mechanism.
We have designed an adapter structure to incorporate reaction conditions into the chemical reaction representation, allowing the model to handle diverse reaction conditions and adapt to various datasets and downstream tasks, e.g., reaction performance prediction.
arXiv Detail & Related papers (2024-11-26T17:41:44Z)
- Physics-Informed Neural Networks with Hard Linear Equality Constraints [9.101849365688905]
This work proposes a novel physics-informed neural network, KKT-hPINN, which rigorously guarantees hard linear equality constraints.
Experiments on Aspen models of a stirred-tank reactor unit, an extractive distillation subsystem, and a chemical plant demonstrate that this model can further enhance prediction accuracy; a minimal projection sketch in this spirit appears after this related-papers list.
arXiv Detail & Related papers (2024-02-11T17:40:26Z)
- Gibbs-Duhem-Informed Neural Networks for Binary Activity Coefficient Prediction [45.84205238554709]
We propose Gibbs-Duhem-informed neural networks for the prediction of binary activity coefficients at varying compositions.
We include the Gibbs-Duhem equation explicitly in the loss function for training neural networks; a toy version of this loss is sketched after this related-papers list.
arXiv Detail & Related papers (2023-05-31T07:36:45Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Toward Development of Machine Learned Techniques for Production of Compact Kinetic Models [0.0]
Chemical kinetic models are an essential component in the development and optimisation of combustion devices.
We present a novel automated compute intensification methodology to produce overly-reduced and optimised chemical kinetic models.
arXiv Detail & Related papers (2022-02-16T12:31:24Z)
- Prediction of liquid fuel properties using machine learning models with Gaussian processes and probabilistic conditional generative learning [56.67751936864119]
The present work aims to construct cheap-to-compute machine learning (ML) models to act as closure equations for predicting the physical properties of alternative fuels.
Those models can be trained using the database from MD simulations and/or experimental measurements in a data-fusion-fidelity approach.
The results show that ML models can accurately predict fuel properties over a wide range of pressure and temperature conditions.
arXiv Detail & Related papers (2021-10-18T14:43:50Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Kinetics-Informed Neural Networks [0.0]
We use feed-forward artificial neural networks as basis functions for the construction of surrogate models to solve ordinary differential equations.
We show that the simultaneous training of neural nets and kinetic model parameters in a regularized multiobjective optimization setting leads to the solution of the inverse problem.
This surrogate approach to inverse kinetic ODEs can assist in the elucidation of reaction mechanisms based on transient data.
arXiv Detail & Related papers (2020-11-30T00:07:09Z)
- Sobolev training of thermodynamic-informed neural networks for smoothed elasto-plasticity models with level set hardening [0.0]
We introduce a deep learning framework designed to train smoothed elastoplasticity models with interpretable components.
By recasting the yield function as an evolving level set, we introduce a machine learning approach to predict the solutions of the Hamilton-Jacobi equation.
arXiv Detail & Related papers (2020-10-15T22:43:32Z)
- Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach are demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
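
As referenced in the KKT-hPINN entry above, hard linear equality constraints can be enforced exactly by appending a fixed projection layer derived from the KKT conditions of an equality-constrained least-squares problem. The sketch below is an illustrative toy (not the authors' code): any raw network output z is mapped to the closest point satisfying A x = b.

```python
import numpy as np

def project_onto_equality(z, A, b):
    """Orthogonal projection of z onto {x : A x = b}: z - A^T (A A^T)^{-1} (A z - b)."""
    return z - A.T @ np.linalg.solve(A @ A.T, A @ z - b)

# Toy example: a mass-balance-style constraint x1 + x2 + x3 = 1.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
z = np.array([0.2, 0.5, 0.6])          # unconstrained network output
x = project_onto_equality(z, A, b)
print(A @ x - b)                       # ~0: the constraint holds exactly by construction
```

Since A and b are fixed, the projection is an affine layer with a constant Jacobian, so it can be appended to any physics-informed network and trained end to end.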
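
Similarly, the Gibbs-Duhem-informed entry above adds a thermodynamic consistency term to the training loss. A toy PyTorch version (illustrative, not the authors' code) penalizes the binary Gibbs-Duhem residual x1 d(ln gamma1)/dx1 + (1 - x1) d(ln gamma2)/dx1, computed by automatic differentiation, alongside the usual data loss.

```python
import torch

# Small network: mole fraction x1 -> [ln gamma1, ln gamma2]
model = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 2))

def gibbs_duhem_penalty(x1):
    """Mean squared binary Gibbs-Duhem residual at constant T and p."""
    x1 = x1.clone().requires_grad_(True)
    ln_gamma = model(x1)
    d1 = torch.autograd.grad(ln_gamma[:, 0].sum(), x1, create_graph=True)[0]
    d2 = torch.autograd.grad(ln_gamma[:, 1].sum(), x1, create_graph=True)[0]
    return ((x1 * d1 + (1.0 - x1) * d2) ** 2).mean()

# One illustrative training step on synthetic placeholder data.
x_data, y_data = torch.rand(64, 1), torch.zeros(64, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = torch.nn.functional.mse_loss(model(x_data), y_data) + 1.0 * gibbs_duhem_penalty(torch.rand(128, 1))
loss.backward()
optimizer.step()
```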
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.