Inferring processes within dynamic forest models using hybrid modeling
- URL: http://arxiv.org/abs/2508.01228v1
- Date: Sat, 02 Aug 2025 06:46:37 GMT
- Title: Inferring processes within dynamic forest models using hybrid modeling
- Authors: Maximilian Pichler, Yannek Käber
- Abstract summary: We introduce Forest Informed Neural Networks (FINN), a hybrid modeling approach that combines a forest gap model with deep neural networks (DNN). We demonstrate that replacing the growth process with a DNN improves predictive performance and succession trajectories compared to a fully mechanistic version of FINN. In conclusion, our new hybrid modeling approach offers a versatile opportunity to infer forest dynamics from data and to improve forecasts of ecosystem trajectories under unprecedented environmental change.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modeling forest dynamics under novel climatic conditions requires a careful balance between process-based understanding and empirical flexibility. Dynamic Vegetation Models (DVM) represent ecological processes mechanistically, but their performance is prone to misspecified assumptions about functional forms. Inferring the structure of these processes and their functional forms correctly from data remains a major challenge because current approaches, such as plug-in estimators, have proven ineffective. We introduce Forest Informed Neural Networks (FINN), a hybrid modeling approach that combines a forest gap model with deep neural networks (DNN). FINN replaces processes with DNNs, which are then calibrated alongside the other mechanistic components in one unified step. In a case study on the Barro Colorado Island 50-ha plot we demonstrate that replacing the growth process with a DNN improves predictive performance and succession trajectories compared to a fully mechanistic version of FINN. Furthermore, we discovered that the DNN learned an ecologically plausible, improved functional form of growth, which we extracted from the DNN using explainable AI. In conclusion, our new hybrid modeling approach offers a versatile opportunity to infer forest dynamics from data and to improve forecasts of ecosystem trajectories under unprecedented environmental change.
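To make the hybrid idea concrete, here is a minimal toy sketch of a gap-model step in which the growth process is a small learnable function calibrated alongside the mechanistic shell. All formulas (the shading rule, the mortality rate) and parameter names are invented for illustration and are not taken from FINN itself.

```python
import math
import random

def growth_dnn(dbh, light, w):
    # Tiny stand-in for the growth DNN: one hidden tanh layer.
    # w = [input weights (dbh), input weights (light), biases, output weights]
    h = [math.tanh(w[0][i] * dbh + w[1][i] * light + w[2][i])
         for i in range(len(w[0]))]
    # Square the output so the diameter increment is non-negative.
    return sum(w[3][i] * h[i] for i in range(len(h))) ** 2

def gap_model_step(cohorts, w, mortality=0.02):
    """One annual step of a toy gap model: shading, mortality, growth.

    cohorts is a list of tree diameters (dbh, cm); only the growth term
    is 'learned', the rest stays mechanistic, mirroring the hybrid setup.
    """
    total_ba = sum(math.pi * (c / 200.0) ** 2 for c in cohorts)  # basal area, m^2
    light = math.exp(-0.5 * total_ba)  # simple Beer-Lambert style shading
    survivors = [c for c in cohorts if random.random() > mortality]
    return [c + growth_dnn(c, light, w) for c in survivors]
```

In a real calibration, the weights `w` would be fitted jointly with the mechanistic parameters against observed stand data in one unified optimization step, which is the key difference from plug-in estimation.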
Related papers
- Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
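For reference, a generic Euler-Maruyama discretization of the underdamped Langevin equation looks like the following; this is a standard textbook scheme for a scalar latent state, not LangevinFlow's actual implementation, and the potential gradient here is a placeholder for the learned one.

```python
import math
import random

def langevin_step(z, v, grad_U, gamma=0.5, dt=0.01):
    """One Euler-Maruyama step of the underdamped Langevin equation.

    z, v   : position and velocity of the latent state
    grad_U : gradient of the (learned) potential evaluated at z
    gamma  : damping coefficient
    """
    noise = random.gauss(0.0, 1.0)
    # dv = (-gamma*v - grad U(z)) dt + sqrt(2*gamma) dW
    v_new = v + (-gamma * v - grad_U(z)) * dt + math.sqrt(2.0 * gamma * dt) * noise
    # Semi-implicit update: advance position with the new velocity.
    z_new = z + v_new * dt
    return z_new, v_new
```

With a quadratic potential (grad_U(z) = z) the chain samples from a Gaussian stationary distribution, which is the usual sanity check for such an integrator.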
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - Variational Graph Convolutional Neural Networks [72.67088029389764]
Uncertainty can help improve the explainability of Graph Convolutional Networks. Uncertainty can also be used in critical applications to verify the results of the model.
arXiv Detail & Related papers (2025-07-02T13:28:37Z) - Modelling Mosquito Population Dynamics using PINN-derived Empirical Parameters [5.585625844344932]
We focus on improving the parameterisation of biological processes in mechanistic models using PINNs to determine inverse parameters. PINNs embed physical, biological, or chemical laws into neural networks trained on observed or measured data. For a deeper understanding of the performance of PINN models, a final validation was used to investigate how modifications to PINN architectures affect the performance of the framework.
arXiv Detail & Related papers (2024-12-10T13:51:48Z) - Autaptic Synaptic Circuit Enhances Spatio-temporal Predictive Learning of Spiking Neural Networks [23.613277062707844]
Spiking Neural Networks (SNNs) emulate the leaky integrate-and-fire mechanism found in biological neurons.
Existing SNNs predominantly rely on the Leaky Integrate-and-Fire (LIF) model.
This paper proposes a novel SpatioTemporal Circuit (STC) model.
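The LIF model that the paper builds on is standard and easy to state: the membrane potential leaks toward rest, integrates input current, and fires a spike (resetting) on crossing a threshold. A minimal discrete-time sketch, with illustrative parameter values:

```python
def lif_step(v, input_current, tau=20.0, v_rest=0.0,
             v_thresh=1.0, v_reset=0.0, dt=1.0):
    """One Euler step of the Leaky Integrate-and-Fire neuron.

    Returns the updated membrane potential and whether a spike fired.
    """
    # Leak toward v_rest while integrating the input current.
    v = v + dt * (-(v - v_rest) + input_current) / tau
    spike = v >= v_thresh
    if spike:
        v = v_reset  # hard reset after a spike
    return v, spike
```

With a constant input above threshold the neuron fires periodically; with a sub-threshold input the potential settles below v_thresh and no spikes occur, which is the basic behavior any LIF variant (including circuit-augmented ones like STC) refines.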
arXiv Detail & Related papers (2024-06-01T11:17:27Z) - Self Expanding Convolutional Neural Networks [1.4330085996657045]
We present a novel method for dynamically expanding Convolutional Neural Networks (CNNs) during training.
We employ a strategy where a single model is dynamically expanded, facilitating the extraction of checkpoints at various complexity levels.
arXiv Detail & Related papers (2024-01-11T06:22:40Z) - ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of the DNN based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models as well as the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - Disentangled Generative Models for Robust Prediction of System Dynamics [2.6424064030995957]
In this work, we treat the domain parameters of dynamical systems as factors of variation of the data generating process.
By leveraging ideas from supervised disentanglement and causal factorization, we aim to separate the domain parameters from the dynamics in the latent space of generative models.
Results indicate that disentangled VAEs adapt better to domain parameters spaces that were not present in the training data.
arXiv Detail & Related papers (2021-08-26T09:58:06Z) - Sparse Flows: Pruning Continuous-depth Models [107.98191032466544]
We show that pruning improves generalization for neural ODEs in generative modeling.
We also show that pruning finds minimal and efficient neural ODE representations with up to 98% fewer parameters compared to the original network, without loss of accuracy.
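Pruning of this kind is often done by weight magnitude: zero out the smallest fraction of parameters and keep the rest. The sketch below shows generic magnitude pruning on a flat parameter list; it is an illustration of the family of techniques, not the paper's specific criterion.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of weights.

    weights  : flat list of parameters
    sparsity : fraction in [0, 1) to remove (ties at the threshold
               may prune slightly more than requested)
    """
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    # Threshold = magnitude of the k-th smallest weight.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]
```

Applied to the parameters of a neural ODE's vector field, iterating prune-then-finetune is the usual recipe for reaching high sparsity without losing accuracy.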
arXiv Detail & Related papers (2021-06-24T01:40:17Z) - Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.