Autonomous learning of nonlocal stochastic neuron dynamics
- URL: http://arxiv.org/abs/2011.10955v2
- Date: Tue, 7 Sep 2021 19:13:34 GMT
- Title: Autonomous learning of nonlocal stochastic neuron dynamics
- Authors: Tyler E. Maltba (1), Hongli Zhao (1), Daniel M. Tartakovsky (2) ((1)
UC Berkeley, (2) Stanford University)
- Abstract summary: Neuronal dynamics is driven by externally imposed or internally generated random excitations/noise, and is often described by systems of random or stochastic ordinary differential equations.
Such systems admit a single-time joint probability density function (PDF) of system states, which can be used to calculate information-theoretic quantities such as the mutual information between the stimulus and various internal states of the neuron.
We propose two methods for closing the corresponding PDF equations: a modified nonlocal large-eddy-diffusivity closure and a data-driven closure relying on sparse regression to learn relevant features.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuronal dynamics is driven by externally imposed or internally generated
random excitations/noise, and is often described by systems of random or
stochastic ordinary differential equations. Such systems admit a distribution
of solutions, which is (partially) characterized by the single-time joint
probability density function (PDF) of system states. It can be used to
calculate such information-theoretic quantities as the mutual information
between the stochastic stimulus and various internal states of the neuron
(e.g., membrane potential), as well as various spiking statistics. When random
excitations are modeled as Gaussian white noise, the joint PDF of neuron states
satisfies exactly a Fokker-Planck equation. However, most biologically
plausible noise sources are correlated (colored). In this case, the resulting
PDF equations require a closure approximation. We propose two methods for
closing such equations: a modified nonlocal large-eddy-diffusivity closure and
a data-driven closure relying on sparse regression to learn relevant features.
The closures are tested for the stochastic non-spiking leaky integrate-and-fire
and FitzHugh-Nagumo (FHN) neurons driven by sine-Wiener noise. Mutual
information and total correlation between the random stimulus and the internal
states of the neuron are calculated for the FHN neuron.
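The quantities named in the abstract (the single-time joint PDF and the mutual information between the colored stimulus and the membrane potential) can also be estimated by brute-force Monte Carlo, which is how closures of this kind are typically benchmarked. The sketch below is a minimal Python illustration, not the authors' closure method: it integrates an ensemble of FitzHugh-Nagumo neurons driven by sine-Wiener noise with an explicit Euler scheme, histograms the joint PDF at the final time, and forms a plug-in mutual-information estimate. The noise definition eta(t) = A*sin(sqrt(2/tau)*W(t)) and all parameter values are assumptions made for illustration.
```python
# Minimal Monte Carlo sketch (not the paper's closure method): integrate an
# ensemble of FitzHugh-Nagumo neurons driven by sine-Wiener noise, histogram
# the single-time joint PDF of (stimulus, membrane potential), and form a
# plug-in estimate of their mutual information. Parameter values are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Sine-Wiener noise eta(t) = A * sin(sqrt(2/tau) * W(t)), W a Wiener process
A, tau = 0.5, 1.0            # amplitude and correlation time (assumed)
# Classic FHN parameters (not necessarily the paper's non-spiking regime)
a, b, eps = 0.7, 0.8, 0.08

n_paths, dt, n_steps = 50_000, 1e-2, 1_000   # ensemble size, step, horizon

v = np.full(n_paths, -1.0)   # membrane potential
w = np.zeros(n_paths)        # recovery variable
W = np.zeros(n_paths)        # driving Wiener process
eta = np.zeros(n_paths)      # colored (bounded) stimulus

for _ in range(n_steps):
    W += np.sqrt(dt) * rng.standard_normal(n_paths)
    eta = A * np.sin(np.sqrt(2.0 / tau) * W)       # sine-Wiener noise
    dv = v - v**3 / 3.0 - w + eta                  # FHN voltage drift
    dw = eps * (v + a - b * w)                     # FHN recovery drift
    v += dt * dv
    w += dt * dw

# Single-time joint PDF of (stimulus, membrane potential) at the final time
p_xy, _, _ = np.histogram2d(eta, v, bins=60)
p_xy /= p_xy.sum()
p_x = p_xy.sum(axis=1, keepdims=True)              # marginal of the stimulus
p_y = p_xy.sum(axis=0, keepdims=True)              # marginal of the voltage

mask = p_xy > 0
mi = np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask]))
print(f"Plug-in estimate of I(stimulus; membrane potential): {mi:.3f} nats")
```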
Related papers
- Confidence Regulation Neurons in Language Models [91.90337752432075]
This study investigates the mechanisms by which large language models represent and regulate uncertainty in next-token predictions.
Entropy neurons are characterized by an unusually high weight norm and influence the final layer normalization (LayerNorm) scale to effectively scale down the logits.
Token frequency neurons, which we describe here for the first time, boost or suppress each token's logit proportionally to its log frequency, thereby shifting the output distribution towards or away from the unigram distribution.
arXiv Detail & Related papers (2024-06-24T01:31:03Z)
- Diffusion-based Quantum Error Mitigation using Stochastic Differential Equation [9.913187216180424]
Random fluctuations arising from interaction with the external environment introduce noise into the states of the quantum system, resulting in errors.
This paper introduces a novel approach to mitigate errors using diffusion models.
The approach formulates noise accrual during state evolution as forward-backward stochastic differential equations (FBSDEs) and adapts a score-based generative model (SGM) to denoise errors in quantum states.
arXiv Detail & Related papers (2024-05-23T07:59:26Z)
- Weak Collocation Regression for Inferring Stochastic Dynamics with Lévy Noise [8.15076267771005]
We propose a weak form of the Fokker-Planck (FP) equation for extracting dynamics with Lévy noise.
Our approach can simultaneously distinguish mixed noise types, even in multi-dimensional problems.
arXiv Detail & Related papers (2024-03-13T06:54:38Z)
- Inferring Inference [7.11780383076327]
We develop a framework for inferring canonical distributed computations from large-scale neural activity patterns.
We simulate recordings for a model brain that implicitly implements an approximate inference algorithm on a probabilistic graphical model.
Overall, this framework provides a new tool for discovering interpretable structure in neural recordings.
arXiv Detail & Related papers (2023-10-04T22:12:11Z)
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore the parallels between stochastic gradient descent (SGD) and natural noise-driven processes.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Neural Stochastic Partial Differential Equations [1.2183405753834562]
We introduce the Neural SPDE model, which extends two important classes of physics-inspired neural architectures.
On the one hand, it extends the popular neural differential equation models (ordinary, controlled, rough) in that it is capable of processing incoming information.
On the other hand, it extends Neural Operators (recent generalizations of neural networks modelling mappings between function spaces) in that it can be used to learn complex SPDE solution operators.
arXiv Detail & Related papers (2021-10-19T20:35:37Z)
- The Separation Capacity of Random Neural Networks [78.25060223808936]
We show that a sufficiently large two-layer ReLU-network with standard Gaussian weights and uniformly distributed biases can solve this problem with high probability.
We quantify the relevant structure of the data in terms of a novel notion of mutual complexity.
arXiv Detail & Related papers (2021-07-31T10:25:26Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Optimal Learning with Excitatory and Inhibitory synapses [91.3755431537592]
I study the problem of storing associations between analog signals in the presence of correlations.
I characterize the typical learning performance in terms of the power spectrum of random input and output processes.
arXiv Detail & Related papers (2020-05-25T18:25:54Z)
- Neural network representation of the probability density function of diffusion processes [0.0]
Physics-informed neural networks are developed to characterize the state of dynamical systems in a random environment.
We examine analytically and numerically the advantages and disadvantages of solving each type of differential equation to characterize the state.
arXiv Detail & Related papers (2020-01-15T17:15:24Z)