Towards an Hybrid Hodgkin-Huxley Action Potential Generation Model
- URL: http://arxiv.org/abs/2304.01346v1
- Date: Wed, 15 Mar 2023 22:39:23 GMT
- Title: Towards an Hybrid Hodgkin-Huxley Action Potential Generation Model
- Authors: Lautaro Estienne
- Abstract summary: We investigate the possibility of finding the Hodgkin-Huxley model's parametric functions using only two simple measurements.
Experiments were carried out using data generated from the original Hodgkin-Huxley model.
Results show that a simple two-layer artificial neural network architecture trained on a minimal amount of data can learn to model some of the fundamental properties of action potential generation.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mathematical models for the generation of the action potential can improve
the understanding of physiological mechanisms that are a consequence of the
electrical activity in neurons. Such models typically define equations that
involve empirically obtained functions of the membrane potential.
The best known of these models, the Hodgkin-Huxley model, is an example of this
paradigm since it defines the conductances of ion channels in terms of the
opening and closing rates of each type of gate present in the channels. These
functions need to be derived from laboratory measurements that are often very
expensive and produce little data because they involve a time-space-independent
measurement of the voltage in a single channel of the cell membrane. In this
work, we investigate the possibility of finding the Hodgkin-Huxley model's
parametric functions using only two simple measurements (the membrane voltage
as a function of time and the injected current that triggered that voltage) and
applying Deep Learning methods to estimate these functions. This would result
in a hybrid model of action potential generation, composed of the original
Hodgkin-Huxley equations and an Artificial Neural Network that requires a small
set of easy-to-perform measurements to be trained. Experiments were carried out
using data generated from the original Hodgkin-Huxley model, and results show
that a simple two-layer artificial neural network (ANN) architecture trained on
a minimal amount of data can learn to model some of the fundamental properties
of the action potential generation by estimating the model's rate functions.
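To make the idea concrete, the sketch below shows one way such a hybrid model could be wired up. This is a minimal illustration under our own assumptions, not the paper's actual architecture or training code: the Hodgkin-Huxley gating equations dx/dt = alpha_x(V)(1 - x) - beta_x(V)x are kept as-is, while each rate function alpha_x(V), beta_x(V) is swapped for a small two-layer network. The constants are the standard squid-axon values; all network weights are untrained random placeholders, so the output is not a realistic action potential.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_rate_net(hidden=8):
    """Hypothetical two-layer MLP mapping V (mV) to a rate; softplus keeps it positive."""
    W1 = rng.normal(scale=0.1, size=(hidden, 1))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.1, size=(1, hidden))
    b2 = np.zeros(1)
    def rate(v):
        h = np.tanh(W1 @ np.array([v]) + b1)             # hidden layer
        return float(np.log1p(np.exp(W2 @ h + b2)[0]))   # softplus output
    return rate

# One (alpha, beta) network pair per gate type: m, h (Na+) and n (K+).
rates = {g: (make_rate_net(), make_rate_net()) for g in "mhn"}

# Standard squid-axon HH constants (uF/cm^2, mS/cm^2, mV).
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4

def step(V, m, h, n, I_ext, dt=0.01):
    """One forward-Euler step of the hybrid Hodgkin-Huxley system."""
    gates = {}
    for name, x in (("m", m), ("h", h), ("n", n)):
        alpha, beta = rates[name]
        # Original HH gating ODE, with learned rate functions in place of
        # the empirically fitted alpha/beta expressions:
        gates[name] = x + dt * (alpha(V) * (1.0 - x) - beta(V) * x)
    I_ion = (g_Na * m**3 * h * (V - E_Na)   # sodium current
             + g_K * n**4 * (V - E_K)       # potassium current
             + g_L * (V - E_L))             # leak current
    V_next = V + dt * (I_ext - I_ion) / C_m
    return V_next, gates["m"], gates["h"], gates["n"]

# Simulate 5 ms with a constant 10 uA/cm^2 current injection.
V, m, h, n = -65.0, 0.05, 0.6, 0.32
for _ in range(500):
    V, m, h, n = step(V, m, h, n, I_ext=10.0)
print(f"membrane potential after 5 ms: {V:.2f} mV")
```

Presumably, training would then adjust the six rate networks so that the simulated V(t) reproduces a measured voltage trace under the same injected current I(t), which are the two easy-to-perform measurements the abstract refers to.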
Related papers
- Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation.
Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems.
Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
arXiv Detail & Related papers (2025-07-15T17:57:48Z)
- NOBLE -- Neural Operator with Biologically-informed Latent Embeddings to Capture Experimental Variability in Biological Neuron Models [68.89389652724378]
NOBLE is a neural operator framework that learns a mapping from a continuous frequency-modulated embedding of interpretable neuron features to the somatic voltage response induced by current injection.
It predicts distributions of neural dynamics accounting for the intrinsic experimental variability.
NOBLE is the first scaled-up deep learning framework validated on real experimental data.
arXiv Detail & Related papers (2025-06-05T01:01:18Z)
- Modeling Neural Activity with Conditionally Linear Dynamical Systems [14.902340082626653]
We develop Conditionally Linear Dynamical System (CLDS) models as a general-purpose method to characterize neural dynamics.
We find that CLDS models can perform well even in severely data-limited regimes.
In example applications, we apply CLDS to model thalamic neurons that nonlinearly encode heading direction and motor cortical neurons during a cued reaching task.
arXiv Detail & Related papers (2025-02-25T16:36:24Z)
- Discovering intrinsic multi-compartment pharmacometric models using Physics Informed Neural Networks [0.0]
We introduce PKINNs, a novel purely data-driven neural network model.
PKINNs efficiently discovers and models intrinsic multi-compartment-based pharmacometric structures.
The resulting models are both interpretable and explainable through Symbolic Regression methods.
arXiv Detail & Related papers (2024-04-30T19:31:31Z)
- Molecule Design by Latent Prompt Transformer [76.2112075557233]
This work explores the challenging problem of molecule design by framing it as a conditional generative modeling task.
We propose a novel generative model comprising three components: (1) a latent vector with a learnable prior distribution; (2) a molecule generation model based on a causal Transformer, which uses the latent vector as a prompt; and (3) a property prediction model that predicts a molecule's target properties and/or constraint values using the latent prompt.
arXiv Detail & Related papers (2024-02-27T03:33:23Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Physics-constrained neural differential equations for learning multi-ionic transport [0.0]
We develop the first physics-informed deep learning model to learn ion transport behaviour across polyamide nanopores.
We use neural differential equations in conjunction with classical closure models, incorporated as inductive biases directly into the neural framework.
arXiv Detail & Related papers (2023-03-07T17:18:52Z)
- Discovery of sparse hysteresis models for piezoelectric materials [1.3669389861593737]
This article presents an approach for modelling hysteresis in piezoelectric materials using sparse-regression techniques.
The presented approach is compared to traditional regression-based and neural network methods, demonstrating its efficiency and robustness.
arXiv Detail & Related papers (2023-02-10T15:21:36Z)
- Prediction of liquid fuel properties using machine learning models with Gaussian processes and probabilistic conditional generative learning [56.67751936864119]
The present work aims to construct cheap-to-compute machine learning (ML) models to act as closure equations for predicting the physical properties of alternative fuels.
Those models can be trained using the database from MD simulations and/or experimental measurements in a data-fusion-fidelity approach.
The results show that ML models can accurately predict fuel properties across a wide range of pressure and temperature conditions.
arXiv Detail & Related papers (2021-10-18T14:43:50Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
In a similar way, artificial neurons in our generative model predict what neighboring neurons will do, and adjust their parameters based on how well the predictions matched reality.
arXiv Detail & Related papers (2020-12-07T01:20:38Z)
- Discovery of the Hidden State in Ionic Models Using a Domain-Specific Recurrent Neural Network [0.0]
We describe a recurrent neural network architecture designed specifically to encode ionic models.
The network is trained in two steps: first, it learns the theoretical model coded in a set of ODEs, and second, it is retrained on experimental data.
We tested the GNN networks using simulated ventricular action potential signals and showed that they could deduce physiologically feasible alterations of ionic currents.
arXiv Detail & Related papers (2020-11-14T21:13:41Z)
- Supervised Autoencoders Learn Robust Joint Factor Models of Neural Activity [2.8402080392117752]
Many neuroscience applications collect high-dimensional predictors corresponding to brain activity in different regions, along with behavioral outcomes.
Joint factor models for the predictors and outcomes are natural, but maximum likelihood estimates of these models can struggle in practice when there is model misspecification.
We propose an alternative inference strategy based on supervised autoencoders; rather than placing a probability distribution on the latent factors, we define them as an unknown function of the high-dimensional predictors.
arXiv Detail & Related papers (2020-04-10T19:31:57Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.