Non-parametric generalized linear model
- URL: http://arxiv.org/abs/2009.01362v1
- Date: Wed, 2 Sep 2020 21:54:53 GMT
- Title: Non-parametric generalized linear model
- Authors: Matthew Dowling, Yuan Zhao, Il Memming Park
- Abstract summary: A fundamental problem in statistical neuroscience is to model how neurons encode information by analyzing electrophysiological recordings.
A popular and widely-used approach is to fit the spike trains with an autoregressive point process model.
In practice a sufficiently rich but small ensemble of temporal basis functions needs to be chosen to parameterize the filters.
- Score: 7.936841911281107
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A fundamental problem in statistical neuroscience is to model how neurons
encode information by analyzing electrophysiological recordings. A popular and
widely-used approach is to fit the spike trains with an autoregressive point
process model. These models are characterized by a set of convolutional
temporal filters, whose subsequent analysis can help reveal how neurons encode
stimuli, interact with each other, and process information. In practice a
sufficiently rich but small ensemble of temporal basis functions needs to be
chosen to parameterize the filters. However, obtaining a satisfactory fit often
requires burdensome model selection and fine tuning the form of the basis
functions and their temporal span. In this paper we propose a nonparametric
approach for jointly inferring the filters and hyperparameters using the
Gaussian process framework. Our method is computationally efficient taking
advantage of the sparse variational approximation while being flexible and rich
enough to characterize arbitrary filters in continuous time lag. Moreover, our
method automatically learns the temporal span of the filter. For the particular
application in neuroscience, we designed priors for stimulus and history
filters useful for the spike trains. We compare and validate our method on
simulated and real neural spike train data.
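The parametric baseline the abstract describes (an autoregressive Poisson GLM whose history filter is a weighted sum of hand-chosen temporal basis functions) can be sketched as follows. This is a minimal illustration of the conventional setup the paper's nonparametric GP approach replaces; the raised-cosine basis, function names, and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def raised_cosine_basis(n_basis, window, dt=0.001):
    """A common hand-tuned temporal basis (raised cosines on a log-time
    axis), of the kind the paper's GP prior is designed to replace."""
    t = np.arange(0, window, dt)
    centers = np.linspace(np.log(dt), np.log(window), n_basis)
    width = centers[1] - centers[0]
    log_t = np.log(t + dt)
    arg = np.clip((log_t[:, None] - centers[None, :]) * np.pi / (2 * width),
                  -np.pi, np.pi)
    return 0.5 * (1 + np.cos(arg))        # shape (len(t), n_basis)

def glm_intensity(spikes, B, weights, bias, dt=0.001):
    """Conditional intensity lambda(t) = exp(bias + (h * spikes)(t)),
    with the history filter h parameterized as B @ weights."""
    h = B @ weights                        # filter as basis combination
    conv = np.convolve(spikes, h)[:len(spikes)]
    drive = np.concatenate(([0.0], conv[:-1]))  # strictly causal history
    return np.exp(bias + drive)

rng = np.random.default_rng(0)
B = raised_cosine_basis(n_basis=5, window=0.1)      # 100 ms filter span
w = rng.normal(0, 0.1, size=5)
spikes = rng.binomial(1, 0.05, size=1000).astype(float)
lam = glm_intensity(spikes, B, w, bias=np.log(5.0))
```

Note the two knobs (`n_basis`, `window`) that must be hand-tuned here; jointly inferring the filter shape and its temporal span is exactly what the proposed GP method automates.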
Related papers
- Neuronal Temporal Filters as Normal Mode Extractors [11.075435272349862]
We examine how prediction may lie at the core of brain function by considering a neuron predicting the future of a scalar time series input.
We mathematically analyze the operation of such an algorithm on noisy observations of synthetic data generated by a linear system.
arXiv Detail & Related papers (2024-01-06T16:10:02Z)
- WaLiN-GUI: a graphical and auditory tool for neuron-based encoding [73.88751967207419]
Neuromorphic computing relies on spike-based, energy-efficient communication.
We develop a tool to identify suitable configurations for neuron-based encoding of sample-based data into spike trains.
The WaLiN-GUI is provided open source and with documentation.
arXiv Detail & Related papers (2023-10-25T20:34:08Z)
- Temporal Conditioning Spiking Latent Variable Models of the Neural Response to Natural Visual Scenes [29.592870472342337]
This work presents the temporal conditioning spiking latent variable models (TeCoS-LVM) to simulate the neural response to natural visual stimuli.
We use spiking neurons to produce spike outputs that directly match the recorded trains.
We show that TeCoS-LVM models can produce more realistic spike activity and fit spike statistics more accurately than powerful alternatives.
arXiv Detail & Related papers (2023-06-21T06:30:18Z)
- Low-rank extended Kalman filtering for online learning of neural networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
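The low-rank plus diagonal decomposition mentioned in this summary can be illustrated on its own: representing a covariance as W Wᵀ + diag(d) cuts storage from O(n²) to O(nr) and makes covariance-vector products O(nr). A minimal sketch of the structural idea (not the paper's EKF algorithm):

```python
import numpy as np

# Low-rank plus diagonal covariance: Sigma ~= W @ W.T + diag(d).
# Only (W, d) are stored; Sigma is never formed during filtering.
rng = np.random.default_rng(1)
n, r = 1000, 10
W = rng.normal(size=(n, r))          # low-rank factor, r << n
d = rng.uniform(0.1, 1.0, size=n)    # diagonal part (keeps Sigma PSD here)

def sigma_matvec(v):
    """Sigma @ v in O(n*r) without materializing the n x n matrix."""
    return W @ (W.T @ v) + d * v

v = rng.normal(size=n)
dense = W @ W.T + np.diag(d)         # reference only, for verification
assert np.allclose(sigma_matvec(v), dense @ v)
```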
arXiv Detail & Related papers (2023-05-31T03:48:49Z)
- Neuronal architecture extracts statistical temporal patterns [1.9662978733004601]
We show how higher-order temporal (co-)fluctuations can be employed to represent and process information.
A simple biologically inspired feedforward neuronal model is able to extract information from up to the third order cumulant to perform time series classification.
arXiv Detail & Related papers (2023-01-24T18:21:33Z)
- FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach leads to improved estimation of pattern latency compared to the state of the art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z)
- Neural parameter calibration for large-scale multi-agent models [0.7734726150561089]
We present a method to retrieve accurate probability densities for parameters using neural differential equations.
Combining the two creates a powerful tool that can quickly estimate densities on model parameters, even for very large systems.
arXiv Detail & Related papers (2022-09-27T17:36:26Z)
- Deep Learning for the Benes Filter [91.3755431537592]
We present a new numerical method based on the mesh-free neural network representation of the density of the solution of the Benes model.
We discuss the role of nonlinearity in the filtering model equations for the choice of the domain of the neural network.
arXiv Detail & Related papers (2022-03-09T14:08:38Z)
- Deep Metric Learning with Locality Sensitive Angular Loss for Self-Correcting Source Separation of Neural Spiking Signals [77.34726150561087]
We propose a methodology based on deep metric learning to address the need for automated post-hoc cleaning and robust separation filters.
We validate this method with an artificially corrupted label set based on source-separated high-density surface electromyography recordings.
This approach enables a neural network to learn to accurately decode neurophysiological time series using any imperfect method of labelling the signal.
arXiv Detail & Related papers (2021-10-13T21:51:56Z)
- Adaptive conversion of real-valued input into spike trains [91.3755431537592]
This paper presents a biologically plausible method for converting real-valued input into spike trains for processing with spiking neural networks.
The proposed method mimics the adaptive behaviour of retinal ganglion cells and allows input neurons to adapt their response to changes in the statistics of the input.
arXiv Detail & Related papers (2021-04-12T12:33:52Z)
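Adaptive real-to-spike conversion of the kind summarized in the last entry can be sketched with a toy threshold-adaptation encoder: the firing threshold rises after each spike and decays back, so the neuron desensitizes to sustained input. This is an illustrative analogue of retinal-ganglion-cell-style adaptation, not the paper's actual method; all names and constants are assumptions.

```python
import numpy as np

def adaptive_threshold_encode(x, base_threshold=1.0, adapt=0.5, decay=0.9):
    """Integrate the input and spike when the accumulator crosses an
    adaptive threshold; the threshold jumps by `adapt` after each spike
    and relaxes toward `base_threshold` at rate `decay`."""
    spikes = np.zeros(len(x), dtype=int)
    theta = base_threshold
    acc = 0.0
    for i, xi in enumerate(x):
        acc += xi
        if acc >= theta:
            spikes[i] = 1
            acc = 0.0
            theta += adapt                    # adaptation: harder to fire again
        theta = base_threshold + (theta - base_threshold) * decay
    return spikes

x = np.sin(np.linspace(0, 4 * np.pi, 400)) + 1.0   # non-negative toy signal
spikes = adaptive_threshold_encode(x)
```

Because the threshold tracks recent firing, the spike rate reflects changes in the input statistics rather than its absolute level, which is the adaptive behaviour the summary refers to.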
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.