Physically constrained neural networks to solve the inverse problem for
neuron models
- URL: http://arxiv.org/abs/2209.11998v1
- Date: Sat, 24 Sep 2022 12:51:15 GMT
- Title: Physically constrained neural networks to solve the inverse problem for
neuron models
- Authors: Matteo Ferrante, Andrea Duggento, Nicola Toschi
- Abstract summary: Systems biology and systems neurophysiology are powerful tools for a number of key applications in the biomedical sciences.
Recent developments in the field of deep neural networks have demonstrated the possibility of formulating nonlinear, universal approximators.
- Score: 0.29005223064604074
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Systems biology and systems neurophysiology in particular have recently
emerged as powerful tools for a number of key applications in the biomedical
sciences. Nevertheless, such models are often based on complex combinations of
multiscale (and possibly multiphysics) strategies that require ad hoc
computational strategies and pose extremely high computational demands. Recent
developments in the field of deep neural networks have demonstrated the
possibility of formulating nonlinear, universal approximators to estimate
solutions to highly nonlinear and complex problems with significant speed and
accuracy advantages in comparison with traditional models. After synthetic data
validation, we use so-called physically constrained neural networks (PINN) to
simultaneously solve the biologically plausible Hodgkin-Huxley model and infer
its parameters and hidden time-courses from real data under both variable and
constant current stimulation, demonstrating extremely low variability across
spikes and faithful signal reconstruction. The parameter ranges we obtain are
also compatible with prior knowledge. We demonstrate that detailed biological
knowledge can be provided to a neural network, making it able to fit complex
dynamics over both simulated and real data.
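The abstract does not include an implementation, but the core idea it describes (a network that fits the observed voltage trace while its outputs are constrained to satisfy the Hodgkin-Huxley equations, with the maximal conductances left as trainable parameters to be inferred) can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example assuming the original 1952 Hodgkin-Huxley parameterisation (voltage measured relative to rest); the names (HHNet, physics_residual, alpha_beta), the network size, and the loss weighting lam are illustrative assumptions, not the authors' code.

```python
# Minimal PINN sketch for the Hodgkin-Huxley inverse problem (illustrative, not the paper's code).
import torch
import torch.nn as nn

class HHNet(nn.Module):
    """Maps time t (shape [N, 1]) to the hidden HH state (V, m, h, n)."""
    def __init__(self, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 4),
        )
        # Trainable maximal conductances (mS/cm^2), initialised near textbook values.
        self.g_na = nn.Parameter(torch.tensor(120.0))
        self.g_k = nn.Parameter(torch.tensor(36.0))
        self.g_l = nn.Parameter(torch.tensor(0.3))

    def forward(self, t):
        out = self.net(t)
        v = out[:, 0:1]                      # membrane potential (mV, relative to rest)
        gates = torch.sigmoid(out[:, 1:4])   # gating variables m, h, n constrained to (0, 1)
        return v, gates

def alpha_beta(v):
    """Standard HH rate functions (original parameterisation, resting V = 0 mV)."""
    am = 0.1 * (25.0 - v) / (torch.exp((25.0 - v) / 10.0) - 1.0)
    bm = 4.0 * torch.exp(-v / 18.0)
    ah = 0.07 * torch.exp(-v / 20.0)
    bh = 1.0 / (torch.exp((30.0 - v) / 10.0) + 1.0)
    an = 0.01 * (10.0 - v) / (torch.exp((10.0 - v) / 10.0) - 1.0)
    bn = 0.125 * torch.exp(-v / 80.0)
    return (am, bm), (ah, bh), (an, bn)

def physics_residual(model, t, i_ext, c_m=1.0, e_na=115.0, e_k=-12.0, e_l=10.6):
    """Mean squared residuals of the four HH ODEs on collocation points t.

    i_ext is a callable t -> injected current, so both constant and
    time-varying stimulation protocols fit the same interface.
    """
    t = t.requires_grad_(True)
    v, gates = model(t)
    m, h, n = gates[:, 0:1], gates[:, 1:2], gates[:, 2:3]
    grad = lambda y: torch.autograd.grad(y, t, torch.ones_like(y), create_graph=True)[0]
    dv, dm, dh, dn = grad(v), grad(m), grad(h), grad(n)
    (am, bm), (ah, bh), (an, bn) = alpha_beta(v)
    i_ion = (model.g_na * m**3 * h * (v - e_na)
             + model.g_k * n**4 * (v - e_k)
             + model.g_l * (v - e_l))
    r_v = c_m * dv - (i_ext(t) - i_ion)
    r_m = dm - (am * (1.0 - m) - bm * m)
    r_h = dh - (ah * (1.0 - h) - bh * h)
    r_n = dn - (an * (1.0 - n) - bn * n)
    return (r_v**2 + r_m**2 + r_h**2 + r_n**2).mean()

def train(model, t_obs, v_obs, t_col, i_ext, steps=5000, lam=1.0):
    """Balance data fidelity on the measured voltage against the physics residual."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(steps):
        opt.zero_grad()
        v_pred, _ = model(t_obs)
        loss = ((v_pred - v_obs)**2).mean() + lam * physics_residual(model, t_col, i_ext)
        loss.backward()
        opt.step()
    return model
```

In this formulation only the membrane potential is observed; the gating time-courses and the conductances are recovered because all outputs must jointly satisfy the ODE residuals at the collocation points, which mirrors the inverse-problem setup described in the abstract.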
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Exploring Biological Neuronal Correlations with Quantum Generative Models [0.0]
We introduce a quantum generative model framework for generating synthetic data that captures the spatial and temporal correlations of biological neuronal activity.
Our model demonstrates the ability to achieve reliable outcomes with fewer trainable parameters compared to classical methods.
arXiv Detail & Related papers (2024-09-13T18:00:06Z)
- A frugal Spiking Neural Network for unsupervised classification of continuous multivariate temporal data [0.0]
Spiking Neural Networks (SNNs) are neuromorphic and use more biologically plausible neurons with evolving membrane potentials.
We introduce here a frugal single-layer SNN designed for fully unsupervised identification and classification of multivariate temporal patterns in continuous data.
arXiv Detail & Related papers (2024-08-08T08:15:51Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
The ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Predicting Biomedical Interactions with Probabilistic Model Selection for Graph Neural Networks [5.156812030122437]
Current biological networks are noisy, sparse, and incomplete. Experimental identification of such interactions is both time-consuming and expensive.
Deep graph neural networks have shown their effectiveness in modeling graph-structured data and achieved good performance in biomedical interaction prediction.
Our proposed method enables graph convolutional networks to dynamically adapt their depth to accommodate an increasing number of interactions.
arXiv Detail & Related papers (2022-11-22T20:44:28Z)
- Impact of spiking neurons leakages and network recurrences on event-based spatio-temporal pattern recognition [0.0]
Spiking neural networks coupled with neuromorphic hardware and event-based sensors are attracting increasing interest for low-latency and low-power inference at the edge.
We explore the impact of synaptic and membrane leakages in spiking neurons.
arXiv Detail & Related papers (2022-11-14T21:34:02Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Bayesian Physics-Informed Neural Networks for real-world nonlinear dynamical systems [0.0]
We integrate data, physics, and uncertainties by combining neural networks, physics-informed modeling, and Bayesian inference.
Our study reveals the inherent advantages and disadvantages of Neural Networks, Bayesian Inference, and a combination of both.
We anticipate that the underlying concepts and trends generalize to more complex disease conditions.
arXiv Detail & Related papers (2022-05-12T19:04:31Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- POPPINS : A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180nm process technology with two hierarchy populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.