Simulation-based Inference for Model Parameterization on Analog
Neuromorphic Hardware
- URL: http://arxiv.org/abs/2303.16056v2
- Date: Mon, 20 Nov 2023 09:55:44 GMT
- Authors: Jakob Kaiser, Raphael Stock, Eric Müller, Johannes Schemmel,
Sebastian Schmitt
- Abstract summary: This study investigates the suitability of the sequential neural posterior estimation algorithm for parameterizing a multi-compartmental neuron model emulated on the BrainScaleS-2 system.
The SNPE algorithm belongs to the class of simulation-based inference methods and estimates the posterior distribution of the model parameters.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The BrainScaleS-2 (BSS-2) system implements physical models of neurons as
well as synapses and aims for an energy-efficient and fast emulation of
biological neurons. When replicating neuroscientific experiments on BSS-2, a
major challenge is finding suitable model parameters. This study investigates
the suitability of the sequential neural posterior estimation (SNPE) algorithm
for parameterizing a multi-compartmental neuron model emulated on the BSS-2
analog neuromorphic system. The SNPE algorithm belongs to the class of
simulation-based inference methods and estimates the posterior distribution of
the model parameters; access to the posterior allows quantifying the confidence
in parameter estimations and unveiling correlation between model parameters.
For our multi-compartmental model, we show that the approximated posterior
agrees with experimental observations and that the identified correlation
between parameters fits theoretical expectations. Furthermore, as already shown
for software simulations, the algorithm can deal with high-dimensional
observations and parameter spaces when the data is generated by emulations on
BSS-2. These results suggest that the SNPE algorithm is a promising approach
for automating the parameterization and analysis of complex models,
especially when dealing with characteristic properties of analog neuromorphic
substrates, such as trial-to-trial variations or limited parameter ranges.
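The core idea of simulation-based inference can be illustrated with a minimal sketch. The snippet below is not the SNPE algorithm itself (SNPE trains a neural density estimator over several rounds); it is the simpler rejection-ABC variant of the same idea, and the simulator is a toy stand-in for the BSS-2 emulation, with all names hypothetical:

```python
import random
import statistics

def simulator(theta, rng):
    # Toy stand-in for a neuron emulation: observation = theta + noise.
    return theta + rng.gauss(0.0, 0.5)

def rejection_abc(observed, prior_low, prior_high, eps, n_draws, seed=0):
    """Approximate the posterior p(theta | observed) by keeping prior
    draws whose simulated output lands within eps of the observation."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(prior_low, prior_high)
        if abs(simulator(theta, rng) - observed) < eps:
            accepted.append(theta)
    return accepted

posterior = rejection_abc(observed=2.0, prior_low=0.0, prior_high=4.0,
                          eps=0.25, n_draws=20000)
print(len(posterior), statistics.mean(posterior))
```

The accepted samples approximate the posterior, so their spread quantifies confidence in the parameter estimate, which is the property the paper exploits on BSS-2.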
Related papers
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
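The "tall data" effect described above, where extra observations sharpen the posterior, can be sketched in a conjugate Gaussian toy model where the posterior is exact; this is only an analytic illustration of the principle, not the paper's diffusion-based sampler:

```python
import random

def gaussian_posterior(obs, prior_mean=0.0, prior_var=1.0, noise_var=0.25):
    """Exact posterior for theta under a Gaussian prior and i.i.d.
    Gaussian observations obs[i] ~ N(theta, noise_var). In the tall
    data setting, each extra observation adds one likelihood factor."""
    n = len(obs)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + sum(obs) / noise_var)
    return post_mean, post_var

rng = random.Random(1)
theta_true = 1.5
obs = [theta_true + rng.gauss(0.0, 0.5) for _ in range(50)]

m1, v1 = gaussian_posterior(obs[:5])
m2, v2 = gaussian_posterior(obs)
print(v2 < v1)  # more observations -> tighter posterior
```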
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
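Fitting a differentiable model to data by gradient descent, as this summary describes, can be shown in miniature. Here the model is a hypothetical linear one and the gradients are written by hand in place of automatic differentiation:

```python
def fit_linear(xs, ys, lr=0.05, steps=2000):
    """Recover parameters (a, b) of y = a*x + b by gradient descent
    on the mean squared error; gradients are analytic stand-ins for
    what an autodiff framework would compute."""
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / n
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # generated by a=2, b=1, no noise
a, b = fit_linear(xs, ys)
print(a, b)
```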
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Supervised Parameter Estimation of Neuron Populations from Multiple Firing Events [3.2826301276626273]
We study an automatic approach to learning the parameters of neuron populations from a training set of pairs of spiking series and parameter labels via supervised learning.
We simulate many neuronal populations at different parameter settings using a neuron model.
We then compare their performance against classical approaches including a genetic search, Bayesian sequential estimation, and a random-walk approximation.
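The supervised recipe above, simulating labeled (observation, parameter) pairs and learning the inverse map, can be sketched with a one-dimensional least-squares fit; the simulator and the "firing rate" summary are hypothetical stand-ins for the paper's spiking model:

```python
import random

rng = random.Random(0)

def simulate_rate(gain):
    # Hypothetical stand-in for a spiking simulation: the summary
    # statistic (firing rate) grows linearly with the neuron gain.
    return 10.0 * gain + rng.gauss(0.0, 0.2)

# Training set: (observation, parameter label) pairs from the simulator.
train = [(simulate_rate(g), g) for g in [i / 100 for i in range(1, 101)]]

# 1-D least-squares fit of the inverse map: parameter = w * rate + c.
n = len(train)
mx = sum(r for r, _ in train) / n
my = sum(g for _, g in train) / n
w = sum((r - mx) * (g - my) for r, g in train) / \
    sum((r - mx) ** 2 for r, _ in train)
c = my - w * mx

# Predict the parameter of a fresh simulation with true gain 0.5.
pred = w * simulate_rate(0.5) + c
print(pred)
```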
arXiv Detail & Related papers (2022-10-02T03:17:05Z)
- On the Influence of Enforcing Model Identifiability on Learning dynamics of Gaussian Mixture Models [14.759688428864159]
We propose a technique for extracting submodels from singular models.
Our method enforces model identifiability during training.
We show how the method can be applied to more complex models like deep neural networks.
arXiv Detail & Related papers (2022-06-17T07:50:22Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Based Lower Bounds for ME-NODE, and develop (efficient) training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
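Inverting a forward model without a tractable likelihood, as described above, reduces to searching for parameters whose simulated summaries best match the observation. A minimal grid-search sketch with a hypothetical two-parameter forward map (not the paper's dMRI system):

```python
def forward(density, volume):
    # Hypothetical two-parameter forward model producing two summaries.
    return (density * volume, density + 0.5 * volume)

observed = forward(0.6, 0.8)   # pretend these are the measured summaries

# Likelihood-free inversion by exhaustive search over a parameter grid,
# minimizing squared distance between simulated and observed summaries.
best, best_err = None, float("inf")
grid = [i / 20 for i in range(1, 21)]   # 0.05 .. 1.0
for d in grid:
    for v in grid:
        sim = forward(d, v)
        err = sum((s - o) ** 2 for s, o in zip(sim, observed))
        if err < best_err:
            best, best_err = (d, v), err
print(best)
```

Likelihood-free inference methods such as the paper's LFI replace this brute-force search with learned posterior approximations, but the matching criterion is the same.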
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Combining data assimilation and machine learning to estimate parameters of a convective-scale model [0.0]
Errors in the representation of clouds in convection-permitting numerical weather prediction models can be introduced by different sources.
In this work, we look at the problem of parameter estimation through an artificial intelligence lens by training two types of artificial neural networks.
arXiv Detail & Related papers (2021-09-07T09:17:29Z)
- Parameter Estimation with Dense and Convolutional Neural Networks Applied to the FitzHugh-Nagumo ODE [0.0]
We present deep neural networks using dense and convolutional layers to solve an inverse problem, where we seek to estimate parameters of a FitzHugh-Nagumo model.
We demonstrate that deep neural networks have the potential to estimate parameters in dynamical models and processes, and they are capable of predicting parameters accurately for the framework.
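The forward model underlying this inverse problem is the FitzHugh-Nagumo ODE system; a forward-Euler integration of it, which generates the kind of traces the networks learn to invert, can be sketched as follows (parameter values are a common default choice, not necessarily those used in the paper):

```python
def simulate_fhn(a=0.7, b=0.8, tau=12.5, current=0.5, dt=0.01, steps=5000):
    """Forward-Euler integration of the FitzHugh-Nagumo model:
        dv/dt = v - v^3/3 - w + I
        dw/dt = (v + a - b*w) / tau
    The inverse problem is recovering (a, b, tau) from such traces."""
    v, w = -1.0, 0.0
    trace = []
    for _ in range(steps):
        dv = v - v ** 3 / 3 - w + current
        dw = (v + a - b * w) / tau
        v += dt * dv
        w += dt * dw
        trace.append(v)
    return trace

trace = simulate_fhn()
print(len(trace))
```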
arXiv Detail & Related papers (2020-12-12T01:20:42Z)
- On the Sparsity of Neural Machine Translation Models [65.49762428553345]
We investigate whether redundant parameters can be reused to achieve better performance.
Experiments and analyses are systematically conducted on different datasets and NMT architectures.
arXiv Detail & Related papers (2020-10-06T11:47:20Z)
- Novel and flexible parameter estimation methods for data-consistent inversion in mechanistic modeling [0.13635858675752988]
We introduce new methods to solve stochastic inverse problems (SIP) based on rejection sampling, Markov chain Monte Carlo, and generative adversarial networks (GANs).
To overcome limitations of SIP, we reformulate SIP based on constrained optimization and present a novel GAN to solve the constrained optimization problem.
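The rejection-sampling route to data-consistent inversion mentioned above can be sketched in a few lines: draw parameters from an initial distribution and keep those whose model output lies in the observed set. The forward map and interval below are toys, and the paper's SIP formulation and GAN variant are considerably richer:

```python
import random

def model(k):
    # Simple deterministic forward map (hypothetical).
    return k ** 2

rng = random.Random(0)
observed_interval = (0.25, 1.0)   # outputs considered data-consistent

# Rejection sampling: keep parameters whose output lands in the set.
accepted = []
for _ in range(10000):
    k = rng.uniform(0.0, 2.0)
    if observed_interval[0] <= model(k) <= observed_interval[1]:
        accepted.append(k)

# Accepted parameters lie in [0.5, 1.0], the preimage of the interval.
print(len(accepted))
```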
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.