Towards an Hybrid Hodgkin-Huxley Action Potential Generation Model
- URL: http://arxiv.org/abs/2304.01346v1
- Date: Wed, 15 Mar 2023 22:39:23 GMT
- Title: Towards an Hybrid Hodgkin-Huxley Action Potential Generation Model
- Authors: Lautaro Estienne
- Abstract summary: We investigate the possibility of finding the Hodgkin-Huxley model's parametric functions using only two simple measurements.
Experiments were carried out using data generated from the original Hodgkin-Huxley model.
Results show that a simple two-layer artificial neural network architecture trained on a minimal amount of data can learn to model some of the fundamental properties of the action potential generation.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mathematical models for the generation of the action potential can improve
the understanding of physiological mechanisms that are a consequence of the
electrical activity in neurons. In such models, some equations involving
empirically obtained functions of the membrane potential are usually defined.
The best known of these models, the Hodgkin-Huxley model, is an example of this
paradigm since it defines the conductances of ion channels in terms of the
opening and closing rates of each type of gate present in the channels. These
functions need to be derived from laboratory measurements that are often very
expensive and produce little data because they involve a time-space-independent
measurement of the voltage in a single channel of the cell membrane. In this
work, we investigate the possibility of finding the Hodgkin-Huxley model's
parametric functions using only two simple measurements (the membrane voltage
as a function of time and the injected current that triggered that voltage) and
applying Deep Learning methods to estimate these functions. This would result
in a hybrid model of the action potential generation composed of the original
Hodgkin-Huxley equations and an Artificial Neural Network that requires a small
set of easy-to-perform measurements to be trained. Experiments were carried out
using data generated from the original Hodgkin-Huxley model, and results show
that a simple two-layer artificial neural network (ANN) architecture trained on
a minimal amount of data can learn to model some of the fundamental properties
of the action potential generation by estimating the model's rate functions.
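For readers unfamiliar with the rate functions the abstract refers to, the following is a minimal sketch of the original Hodgkin-Huxley equations with standard textbook parameter values (an illustration for context, not the paper's code or its exact setup):

```python
import math

# Standard empirically fitted Hodgkin-Huxley rate functions alpha_x(V),
# beta_x(V) for the gates x in {n, m, h} (V in mV, resting potential ~-65 mV).
# These closed-form expressions are what the paper proposes to replace with
# functions learned by a neural network from V(t) and the injected current.

C_M = 1.0                               # membrane capacitance (uF/cm^2)
G_NA, G_K, G_L = 120.0, 36.0, 0.3       # maximal conductances (mS/cm^2)
E_NA, E_K, E_L = 50.0, -77.0, -54.387   # reversal potentials (mV)

def alpha_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
def beta_n(v):  return 0.125 * math.exp(-(v + 65.0) / 80.0)
def alpha_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
def beta_m(v):  return 4.0 * math.exp(-(v + 65.0) / 18.0)
def alpha_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def beta_h(v):  return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))

def simulate(i_inj, t_end=50.0, dt=0.01):
    """Forward-Euler integration; i_inj maps time (ms) to current (uA/cm^2)."""
    v = -65.0
    # start each gate at its steady-state value for the resting potential
    n = alpha_n(v) / (alpha_n(v) + beta_n(v))
    m = alpha_m(v) / (alpha_m(v) + beta_m(v))
    h = alpha_h(v) / (alpha_h(v) + beta_h(v))
    trace = [v]
    for k in range(int(t_end / dt)):
        t = k * dt
        i_na = G_NA * m**3 * h * (v - E_NA)   # sodium current
        i_k = G_K * n**4 * (v - E_K)          # potassium current
        i_l = G_L * (v - E_L)                 # leak current
        v += dt * (i_inj(t) - i_na - i_k - i_l) / C_M
        # gate kinetics: dx/dt = alpha_x(V) * (1 - x) - beta_x(V) * x
        n += dt * (alpha_n(v) * (1 - n) - beta_n(v) * n)
        m += dt * (alpha_m(v) * (1 - m) - beta_m(v) * m)
        h += dt * (alpha_h(v) * (1 - h) - beta_h(v) * h)
        trace.append(v)
    return trace

# A 10 uA/cm^2 step current injected from t = 5 ms triggers action potentials.
trace = simulate(lambda t: 10.0 if t > 5.0 else 0.0)
```

The hybrid model the paper describes keeps the membrane and gate equations above but replaces the six closed-form rate functions with outputs of a trained two-layer network, so only the easily measured pair (V(t), injected current) is needed for fitting.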
Related papers
- Learning to Generate Lumped Hydrological Models [4.368211287521716]
In this study, a generative model was learned from data from over 3,000 catchments worldwide.
The model was then used to derive optimal modeling functions for over 700 different catchments.
Overall, this study demonstrates that the hydrological behavior of a catchment can be effectively described using a small number of latent variables.
arXiv Detail & Related papers (2023-09-18T16:07:41Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Physics-constrained neural differential equations for learning multi-ionic transport [0.0]
We develop the first physics-informed deep learning model to learn ion transport behaviour across polyamide nanopores.
We use neural differential equations in conjunction with classical closure models, incorporated directly into the neural framework as inductive biases.
arXiv Detail & Related papers (2023-03-07T17:18:52Z)
- Discovery of sparse hysteresis models for piezoelectric materials [1.3669389861593737]
This article presents an approach for modelling hysteresis in piezoelectric materials using sparse-regression techniques.
The presented approach is compared to traditional regression-based and neural network methods, demonstrating its efficiency and robustness.
arXiv Detail & Related papers (2023-02-10T15:21:36Z)
- Prediction of liquid fuel properties using machine learning models with Gaussian processes and probabilistic conditional generative learning [56.67751936864119]
The present work aims to construct cheap-to-compute machine learning (ML) models to act as closure equations for predicting the physical properties of alternative fuels.
Those models can be trained using the database from MD simulations and/or experimental measurements in a data-fusion-fidelity approach.
The results show that ML models can accurately predict the fuel properties over a wide range of pressure and temperature conditions.
arXiv Detail & Related papers (2021-10-18T14:43:50Z)
- Going Beyond Linear RL: Sample Efficient Neural Function Approximation [76.57464214864756]
We study function approximation with two-layer neural networks.
Our results significantly improve upon what can be attained with linear (or eluder dimension) methods.
arXiv Detail & Related papers (2021-07-14T03:03:56Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
In a similar way, artificial neurons in our generative model predict what neighboring neurons will do, and adjust their parameters based on how well the predictions matched reality.
arXiv Detail & Related papers (2020-12-07T01:20:38Z)
- Discovery of the Hidden State in Ionic Models Using a Domain-Specific Recurrent Neural Network [0.0]
We describe a recurrent neural network architecture designed specifically to encode ionic models.
The network is trained in two steps: first, it learns the theoretical model coded in a set of ODEs, and second, it is retrained on experimental data.
We tested the network using simulated ventricular action potential signals and showed that it could deduce physiologically feasible alterations of ionic currents.
arXiv Detail & Related papers (2020-11-14T21:13:41Z)
- Supervised Autoencoders Learn Robust Joint Factor Models of Neural Activity [2.8402080392117752]
Neuroscience applications collect high-dimensional predictors corresponding to brain activity in different regions along with behavioral outcomes.
Joint factor models for the predictors and outcomes are natural, but maximum likelihood estimates of these models can struggle in practice when there is model misspecification.
We propose an alternative inference strategy based on supervised autoencoders; rather than placing a probability distribution on the latent factors, we define them as an unknown function of the high-dimensional predictors.
arXiv Detail & Related papers (2020-04-10T19:31:57Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.