Amortized Inference of Neuron Parameters on Analog Neuromorphic Hardware
- URL: http://arxiv.org/abs/2602.10763v2
- Date: Thu, 12 Feb 2026 15:05:37 GMT
- Title: Amortized Inference of Neuron Parameters on Analog Neuromorphic Hardware
- Authors: Jakob Kaiser, Eric Müller, Johannes Schemmel
- Abstract summary: Our work utilized a non-sequential simulation-based inference algorithm to provide an amortized neural density estimator. We constrained the large parameter space by training a binary classifier to predict parameter combinations yielding observations in regimes of interest. Our results validate amortized simulation-based inference as a tool for parameterizing analog neuron circuits.
- Score: 1.4777635405760978
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Our work utilized a non-sequential simulation-based inference algorithm to provide an amortized neural density estimator, which approximates the posterior distribution for seven parameters of the adaptive exponential integrate-and-fire neuron model of the analog neuromorphic BrainScaleS-2 substrate. We constrained the large parameter space by training a binary classifier to predict parameter combinations yielding observations in regimes of interest, i.e., moderate spike counts. We compared two neural density estimators: one using handcrafted summary statistics and one using a summary network trained jointly with the neural density estimator. The summary network yielded a more focused posterior and generated posterior predictive traces that accurately captured the membrane potential dynamics. With handcrafted summary statistics, posterior predictive traces matched the included features but showed deviations in the exact dynamics. The posteriors showed signs of bias and miscalibration but still yielded posterior predictive samples close to the target observations on which the posteriors were conditioned. Our results validate amortized simulation-based inference as a tool for parameterizing analog neuron circuits.
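The amortized workflow described in the abstract can be sketched with the open-source `sbi` package. The following is a minimal illustration, not the authors' implementation: the toy simulator, parameter ranges, and spike-count criterion are placeholders standing in for the BrainScaleS-2 emulation.

```python
# Hedged sketch: amortized NPE with a classifier-restricted prior, using the
# open-source `sbi` package. The simulator below is a stand-in for the
# BrainScaleS-2 AdEx emulation, not the paper's actual setup.
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform, RestrictionEstimator

NUM_PARAMS = 7  # seven AdEx circuit parameters (placeholder ranges)
prior = BoxUniform(low=torch.zeros(NUM_PARAMS), high=torch.ones(NUM_PARAMS))

def simulator(theta: torch.Tensor) -> torch.Tensor:
    """Toy stand-in: returns summary statistics, NaN outside the regime of interest."""
    spike_count = (10.0 * theta.sum(dim=-1, keepdim=True)).round()
    stats = torch.cat([spike_count, theta.mean(dim=-1, keepdim=True)], dim=-1)
    in_regime = (spike_count > 5) & (spike_count < 50)  # "moderate spike counts"
    stats[~in_regime.squeeze(-1)] = float("nan")        # NaN marks uninteresting regimes
    return stats

# 1) Train a binary classifier to restrict the prior to the regime of interest
#    (RestrictionEstimator treats NaN/inf outputs as invalid by default).
theta = prior.sample((5_000,))
x = simulator(theta)
restriction = RestrictionEstimator(prior=prior)
restriction.append_simulations(theta, x)
restriction.train()
restricted_prior = restriction.restrict_prior()

# 2) Train the amortized neural density estimator on restricted-prior simulations.
theta_r = restricted_prior.sample((10_000,))
x_r = simulator(theta_r)
valid = torch.isfinite(x_r).all(dim=-1)
inference = SNPE(prior=prior)
inference.append_simulations(theta_r[valid], x_r[valid])
density_estimator = inference.train()
posterior = inference.build_posterior(density_estimator)

# 3) Amortized inference: condition on any observation without retraining.
x_o = torch.tensor([[20.0, 0.5]])
samples = posterior.sample((1_000,), x=x_o)
```

Because the density estimator is trained once on (restricted-)prior simulations, conditioning on a new observation only requires a forward pass, which is what makes the inference amortized.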
Related papers
- A Simple Approximate Bayesian Inference Neural Surrogate for Stochastic Petri Net Models [0.0]
We introduce a neural-network-based framework for approximating the posterior distribution. Our model employs a lightweight 1D convolutional residual network trained end-to-end on Gillespie-simulated SPN realizations. On synthetic SPNs with 20% missing events, our surrogate recovers rate-function coefficients with an RMSE of 0.108 and runs substantially faster than traditional Bayesian approaches.
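As a rough illustration of the surrogate described above, the sketch below defines a small 1D convolutional residual regressor in PyTorch; the layer sizes, trace length, and number of rate-function coefficients are assumptions, not the paper's architecture.

```python
# Hedged sketch: a lightweight 1D convolutional residual regressor, illustrative of
# the kind of surrogate described above (layer sizes and names are assumptions).
import torch
import torch.nn as nn

class ResBlock1d(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(x + self.conv2(self.act(self.conv1(x))))

class SPNSurrogate(nn.Module):
    """Maps a (batch, channels, time) event trace to rate-function coefficients."""
    def __init__(self, in_channels: int = 1, num_coeffs: int = 4):
        super().__init__()
        self.stem = nn.Conv1d(in_channels, 32, kernel_size=5, padding=2)
        self.blocks = nn.Sequential(ResBlock1d(32), ResBlock1d(32))
        self.head = nn.Linear(32, num_coeffs)

    def forward(self, x):
        h = self.blocks(self.stem(x)).mean(dim=-1)  # global average pool over time
        return self.head(h)

model = SPNSurrogate()
dummy_trace = torch.randn(8, 1, 256)   # e.g. binned Gillespie-simulated event counts
print(model(dummy_trace).shape)        # torch.Size([8, 4])
```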
arXiv Detail & Related papers (2025-07-14T18:31:19Z) - Deep learning with missing data [3.829599191332801]
We propose Pattern Embedded Neural Networks (PENNs), which can be applied in conjunction with any existing imputation technique. In addition to a neural network trained on the imputed data, PENNs pass the vectors of observation indicators through a second neural network to provide a compact representation. The outputs are then combined in a third neural network to produce final predictions.
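A minimal sketch of the three-network pattern summarized above is given below (PyTorch); the layer widths and the single-output head are illustrative assumptions rather than the PENN architecture from the paper.

```python
# Hedged sketch of the pattern described above: one network on imputed features,
# a second on the missingness indicators, and a third combining both outputs.
# Layer sizes are illustrative assumptions, not the paper's architecture.
import torch
import torch.nn as nn

class PatternEmbeddedNet(nn.Module):
    def __init__(self, num_features: int, embed_dim: int = 8):
        super().__init__()
        self.value_net = nn.Sequential(nn.Linear(num_features, 32), nn.ReLU())
        self.mask_net = nn.Sequential(nn.Linear(num_features, embed_dim), nn.ReLU())
        self.combiner = nn.Sequential(nn.Linear(32 + embed_dim, 16), nn.ReLU(),
                                      nn.Linear(16, 1))

    def forward(self, x_imputed, mask):
        # mask: 1.0 where the feature was observed, 0.0 where it was imputed
        h = torch.cat([self.value_net(x_imputed), self.mask_net(mask)], dim=-1)
        return self.combiner(h)

net = PatternEmbeddedNet(num_features=10)
x = torch.randn(4, 10)                    # already-imputed inputs
mask = (torch.rand(4, 10) > 0.2).float()  # observation indicators
print(net(x, mask).shape)                 # torch.Size([4, 1])
```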
arXiv Detail & Related papers (2025-04-21T18:57:36Z) - Reproduction of AdEx dynamics on neuromorphic hardware through data embedding and simulation-based inference [0.8437187555622164]
We use an autoencoder to automatically extract relevant features from the membrane trace of a complex neuron model. We then leverage sequential neural posterior estimation to approximate the posterior distribution of neuron parameters. This suggests that the combination of an autoencoder with the SNPE algorithm is a promising optimization method for complex systems.
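The combination can be sketched as follows with PyTorch and the `sbi` package, shown here with a single inference round for brevity; the toy simulator, trace length, and latent dimension are assumptions, not the paper's setup.

```python
# Hedged sketch: pre-train an autoencoder on membrane traces, then run SNPE on the
# learned latent features via the `sbi` package. All shapes are illustrative.
import torch
import torch.nn as nn
from sbi.inference import SNPE
from sbi.utils import BoxUniform

TRACE_LEN, LATENT = 256, 16

autoencoder = nn.Sequential(                                      # encoder then decoder
    nn.Linear(TRACE_LEN, 64), nn.ReLU(), nn.Linear(64, LATENT),   # encoder
    nn.Linear(LATENT, 64), nn.ReLU(), nn.Linear(64, TRACE_LEN),   # decoder
)
encoder = autoencoder[:3]

def train_autoencoder(traces: torch.Tensor, epochs: int = 50) -> None:
    opt = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(autoencoder(traces), traces)
        loss.backward()
        opt.step()

# Placeholder simulator: maps 4 AdEx-like parameters to a trace of length TRACE_LEN.
prior = BoxUniform(low=torch.zeros(4), high=torch.ones(4))
def simulator(theta):
    t = torch.linspace(0, 1, TRACE_LEN)
    return theta[:, :1] * torch.sin(20 * theta[:, 1:2] * t) + theta[:, 2:3]

theta = prior.sample((2_000,))
traces = simulator(theta)
train_autoencoder(traces)

with torch.no_grad():
    features = encoder(traces)            # compressed data embedding

inference = SNPE(prior=prior)
inference.append_simulations(theta, features)
density_estimator = inference.train()
posterior = inference.build_posterior(density_estimator)
```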
arXiv Detail & Related papers (2024-12-03T13:19:21Z) - A variational neural Bayes framework for inference on intractable posterior distributions [1.0801976288811024]
Posterior distributions of model parameters are efficiently obtained by feeding observed data into a trained neural network.
We show theoretically that our posteriors converge to the true posteriors in Kullback-Leibler divergence.
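A bare-bones sketch of this amortized idea, assuming a simple Gaussian posterior family and a placeholder simulator (not the paper's framework), looks like this:

```python
# Hedged sketch: a network trained on simulated (parameter, data) pairs outputs a
# Gaussian approximate posterior, so inference for new data is one forward pass.
# Prior, simulator, and architecture are illustrative assumptions.
import torch
import torch.nn as nn

class AmortizedGaussianPosterior(nn.Module):
    def __init__(self, data_dim: int, param_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 2 * param_dim))

    def forward(self, x):
        mean, log_std = self.net(x).chunk(2, dim=-1)
        return torch.distributions.Normal(mean, log_std.exp())

def simulate(n, param_dim=2, data_dim=5):
    theta = torch.randn(n, param_dim)                  # placeholder prior
    x = theta @ torch.randn(param_dim, data_dim) + 0.1 * torch.randn(n, data_dim)
    return theta, x

model = AmortizedGaussianPosterior(data_dim=5, param_dim=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
theta, x = simulate(10_000)
for _ in range(200):                                   # minimize E[-log q(theta | x)]
    opt.zero_grad()
    loss = -model(x).log_prob(theta).sum(dim=-1).mean()
    loss.backward()
    opt.step()

x_obs = torch.randn(1, 5)
posterior = model(x_obs)                               # posterior for new data, no refit
```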
arXiv Detail & Related papers (2024-04-16T20:40:15Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data.
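A minimal sketch of this pattern, assuming a toy one-dimensional "scattering" model rather than a real model Hamiltonian: a surrogate network is trained once, then unknown parameters are recovered from data by gradient descent through the frozen surrogate.

```python
# Hedged sketch of the differentiable-surrogate idea above. The toy "simulator",
# dimensions, and parameter values are assumptions for illustration only.
import torch
import torch.nn as nn

def simulator(theta):                     # stand-in for the model-Hamiltonian simulation
    q = torch.linspace(0.1, 3.0, 64)
    return theta[:, :1] * torch.exp(-q * theta[:, 1:2])

surrogate = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 64),
                          nn.Tanh(), nn.Linear(64, 64))

# 1) Train the surrogate once on simulated (parameter, spectrum) pairs.
theta_train = torch.rand(5_000, 2) * 2 + 0.1
target = simulator(theta_train)
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    nn.functional.mse_loss(surrogate(theta_train), target).backward()
    opt.step()

# 2) Recover unknown parameters from "experimental" data via automatic differentiation.
surrogate.requires_grad_(False)           # freeze the surrogate; only fit the parameters
x_exp = simulator(torch.tensor([[1.3, 0.7]])) + 0.01 * torch.randn(1, 64)
theta_hat = torch.tensor([[1.0, 1.0]], requires_grad=True)
fit = torch.optim.Adam([theta_hat], lr=0.05)
for _ in range(300):
    fit.zero_grad()
    nn.functional.mse_loss(surrogate(theta_hat), x_exp).backward()
    fit.step()
print(theta_hat.detach())                 # should approach the true values [1.3, 0.7]
```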
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Simulation-based Inference for Model Parameterization on Analog Neuromorphic Hardware [1.843681725117436]
This study investigates the suitability of the sequential neural posterior estimation algorithm for parameterizing a multi-compartmental neuron model emulated on the BrainScaleS-2 system.
The SNPE algorithm belongs to the class of simulation-based inference methods and estimates the posterior distribution of the model parameters.
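The sequential (multi-round) variant can be sketched with the `sbi` package as below; the simulator is a placeholder, not the multi-compartmental BrainScaleS-2 emulation.

```python
# Hedged sketch of multi-round SNPE with `sbi`: each round simulates from the current
# posterior-as-proposal to focus simulations near the target observation.
# The simulator, parameter ranges, and observation are placeholders.
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

prior = BoxUniform(low=torch.zeros(3), high=torch.ones(3))

def simulator(theta):
    return theta + 0.05 * torch.randn_like(theta)   # stand-in for hardware emulation

x_o = torch.tensor([[0.4, 0.6, 0.2]])               # target observation
inference = SNPE(prior=prior)
proposal = prior
for _ in range(3):                                   # three sequential rounds
    theta = proposal.sample((1_000,))
    x = simulator(theta)
    density_estimator = inference.append_simulations(
        theta, x, proposal=proposal
    ).train()
    posterior = inference.build_posterior(density_estimator)
    proposal = posterior.set_default_x(x_o)

samples = posterior.sample((1_000,), x=x_o)
```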
arXiv Detail & Related papers (2023-03-28T15:37:30Z) - Neural Posterior Estimation with Differentiable Simulators [58.720142291102135]
We present a new method to perform Neural Posterior Estimation (NPE) with a differentiable simulator.
We demonstrate how gradient information helps constrain the shape of the posterior and improves sample-efficiency.
arXiv Detail & Related papers (2022-07-12T16:08:04Z) - Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z) - The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
Artificial neurons in our generative model predict what neighboring neurons will do, and adjust their parameters based on how well the predictions match reality.
arXiv Detail & Related papers (2020-12-07T01:20:38Z) - A Bayesian Perspective on Training Speed and Model Selection [51.15664724311443]
We show that a measure of a model's training speed can be used to estimate its marginal likelihood.
We verify our results in model selection tasks for linear models and for the infinite-width limit of deep neural networks.
Our results suggest a promising new direction towards explaining why neural networks trained with gradient descent are biased towards functions that generalize well.
arXiv Detail & Related papers (2020-10-27T17:56:14Z) - Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time, we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
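A rough sketch of such a min-max formulation, assuming an instrumental-variable-style conditional moment restriction and a simple stabilized objective (illustrative choices, not the paper's exact construction):

```python
# Hedged sketch: a primal network f and an adversarial test-function network g trained
# with alternating gradient steps on a moment condition E[(y - f(x)) g(z)] = 0.
# Data-generating process, penalty weight, and objectives are illustrative assumptions.
import torch
import torch.nn as nn

n = 2_000
z = torch.randn(n, 1)                                 # instrument
confounder = torch.randn(n, 1)
x = z + confounder + 0.1 * torch.randn(n, 1)
y = 2.0 * x + confounder + 0.1 * torch.randn(n, 1)    # structural effect is 2.0

f = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))  # structural function
g = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))  # adversarial test fn

opt_f = torch.optim.Adam(f.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(g.parameters(), lr=1e-3)

for step in range(2_000):
    # adversary step: g tries to expose violated moment conditions (with an L2 penalty)
    moment = ((y - f(x)).detach() * g(z)).mean()
    loss_g = -moment + 0.25 * (g(z) ** 2).mean()
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

    # primal step: f drives the moment toward zero (squared here as a simple
    # stabilization of the saddle-point objective, an illustrative choice)
    loss_f = ((y - f(x)) * g(z).detach()).mean() ** 2
    opt_f.zero_grad()
    loss_f.backward()
    opt_f.step()
```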
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.