Neuronal Temporal Filters as Normal Mode Extractors
- URL: http://arxiv.org/abs/2401.03248v1
- Date: Sat, 6 Jan 2024 16:10:02 GMT
- Title: Neuronal Temporal Filters as Normal Mode Extractors
- Authors: Siavash Golkar, Jules Berman, David Lipshutz, Robert Mihai Haret, Tim
Gollisch, and Dmitri B. Chklovskii
- Abstract summary: We explore how prediction may lie at the core of brain function by considering a neuron predicting the future of a scalar time series input.
We mathematically analyze the operation of such an algorithm on noisy observations of synthetic data generated by a linear system.
- Score: 11.075435272349862
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: To generate actions in the face of physiological delays, the brain must
predict the future. Here we explore how prediction may lie at the core of brain
function by considering a neuron predicting the future of a scalar time series
input. Assuming that the dynamics of the lag vector (a vector composed of
several consecutive elements of the time series) are locally linear, Normal
Mode Decomposition decomposes the dynamics into independently evolving
(eigen-)modes allowing for straightforward prediction. We propose that a neuron
learns the top mode and projects its input onto the associated subspace. Under
this interpretation, the temporal filter of a neuron corresponds to the left
eigenvector of a generalized eigenvalue problem. We mathematically analyze the
operation of such an algorithm on noisy observations of synthetic data
generated by a linear system. Interestingly, the shape of the temporal filter
varies with the signal-to-noise ratio (SNR): a noisy input yields a monophasic
filter and a growing SNR leads to multiphasic filters with a progressively
greater number of phases. Such variation in the temporal filter with input SNR
resembles that observed experimentally in biological neurons.
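A minimal numerical sketch of this construction (assuming the generalized eigenvalue problem pairs the one-step lagged covariance of the lag vector with its instantaneous covariance; the exact matrices, the lag length k, and the demo SNR values are our illustrative choices, not necessarily the paper's):

```python
import numpy as np
from scipy.linalg import eig

def temporal_filter(s, k=20):
    """Estimate a temporal filter as the top left eigenvector of a
    generalized eigenvalue problem built from lag vectors of s.
    The matrix pairing below is an illustrative assumption."""
    T = len(s) - k
    # Lag vectors x_t = (s_t, s_{t-1}, ..., s_{t-k+1})
    X = np.stack([s[t:t + k][::-1] for t in range(T)])   # shape (T, k)
    X0, X1 = X[:-1], X[1:]                               # x_t and x_{t+1}
    C0 = X0.T @ X0 / (T - 1)                             # <x_t x_t^T>
    C1 = X1.T @ X0 / (T - 1)                             # <x_{t+1} x_t^T>
    # Left eigenvectors of A = C1 C0^{-1} satisfy w^T C1 = lam w^T C0,
    # i.e. the generalized problem C1^T w = lam C0^T w.
    lam, W = eig(C1.T, C0.T)
    w = np.real(W[:, np.argmax(np.abs(lam))])            # top normal mode
    return w / np.linalg.norm(w)

# Noisy observations of synthetic data from a linear system (damped oscillation)
rng = np.random.default_rng(0)
t = np.arange(5000)
signal = np.exp(-t / 2000) * np.sin(2 * np.pi * t / 50)
for snr in (0.5, 5.0):
    noisy = signal + rng.standard_normal(t.size) * signal.std() / snr
    print(snr, temporal_filter(noisy)[:5])
```

Plotting the returned filters lets one inspect how the recovered filter shape changes with input SNR.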
Related papers
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are a deep learning model that accounts for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
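A minimal sketch of how a continuous-time recurrent update can absorb irregular sampling (the exponential-Euler relaxation, tanh drive, and parameter names are our assumptions, not the paper's architecture):

```python
import numpy as np

def ctrnn_step(h, x, dt, W, U, tau=1.0):
    """Advance the hidden state across an irregular gap dt: h relaxes
    toward an input-driven target at rate 1/tau, so unevenly spaced
    observations are handled by varying dt, not by resampling."""
    target = np.tanh(W @ h + U @ x)
    return target + (h - target) * np.exp(-dt / tau)

# Example: two glucose readings 0.25 h and 1.5 h apart (hypothetical values)
h = np.zeros(4)
W, U = np.zeros((4, 4)), np.ones((4, 1))
for reading, dt in [(5.6, 0.25), (7.1, 1.5)]:
    h = ctrnn_step(h, np.array([reading]), dt, W, U)
```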
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- A predictive physics-aware hybrid reduced order model for reacting flows [65.73506571113623]
A new hybrid predictive Reduced Order Model (ROM) is proposed to solve reacting flow problems.
The number of degrees of freedom is reduced from thousands of temporal points to a few POD modes with their corresponding temporal coefficients.
Two different deep learning architectures have been tested to predict the temporal coefficients.
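The reduction step can be sketched with plain SVD-based POD (the snapshot matrix below is synthetic and the eight retained modes are an arbitrary choice; the paper's deep learning stage would forecast the temporal coefficients):

```python
import numpy as np

# Synthetic snapshot matrix: rows are spatial degrees of freedom,
# columns are time snapshots (a stand-in for a reacting-flow field).
rng = np.random.default_rng(1)
snapshots = rng.standard_normal((500, 8)) @ rng.standard_normal((8, 2000))

# POD via thin SVD: a few spatial modes plus temporal coefficients.
U, svals, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 8                                   # retained POD modes
modes = U[:, :r]                        # spatial modes, shape (500, r)
coeffs = np.diag(svals[:r]) @ Vt[:r]    # temporal coefficients, shape (r, 2000)

# A learned model would predict coeffs[:, t+1] from past coefficients;
# the full field is then rebuilt as modes @ coeffs.
recon = modes @ coeffs
print(np.linalg.norm(snapshots - recon) / np.linalg.norm(snapshots))
```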
arXiv Detail & Related papers (2023-01-24T08:39:20Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- Deep Learning for the Benes Filter [91.3755431537592]
We present a new numerical method based on the mesh-free neural network representation of the density of the solution of the Benes model.
We discuss the role of nonlinearity in the filtering model equations for the choice of the domain of the neural network.
arXiv Detail & Related papers (2022-03-09T14:08:38Z)
- Sparsification and Filtering for Spatial-temporal GNN in Multivariate Time-series [0.0]
We propose an end-to-end architecture for multivariate time-series prediction that integrates a spatial-temporal graph neural network with a matrix filtering module.
This module generates filtered (inverse) correlation graphs from multivariate time series before inputting them into a GNN.
In contrast with existing sparsification methods adopted in graph neural networks, our model explicitly leverages time-series filtering to overcome the low signal-to-noise ratio typical of complex systems data.
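One way such a filtering module could look (eigenvalue clipping is our illustrative choice of filter; the paper's module may differ):

```python
import numpy as np

def filtered_inverse_correlation(X, keep=5):
    """Denoised (inverse) correlation graph from series X of shape (T, n):
    keep the top eigenvalues, flatten the noisy bulk, then invert."""
    C = np.corrcoef(X, rowvar=False)
    vals, vecs = np.linalg.eigh(C)              # ascending eigenvalues
    bulk = vals[:-keep].mean()                  # average of the noisy bulk
    vals_f = np.concatenate([np.full(len(vals) - keep, bulk), vals[-keep:]])
    P = np.linalg.inv(vecs @ np.diag(vals_f) @ vecs.T)
    np.fill_diagonal(P, 0.0)                    # edge weights for the GNN
    return P

adjacency = filtered_inverse_correlation(
    np.random.default_rng(2).standard_normal((500, 30)))
```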
arXiv Detail & Related papers (2022-03-08T10:44:30Z)
- Efficient Neuromorphic Signal Processing with Loihi 2 [6.32784133039548]
We show how Resonate-and-Fire (RF) neurons can be used to compute the Short Time Fourier Transform (STFT) with similar computational complexity but 47x less output bandwidth than the conventional STFT.
We also demonstrate promising preliminary results using backpropagation to train RF neurons for audio classification tasks.
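The RF/STFT connection can be sketched with a bank of damped complex resonators, each accumulating the input rotated at its own frequency (an exponential-window Fourier bin); the decay, sample rate, and magnitude readout are illustrative, not Loihi 2's implementation:

```python
import numpy as np

def rf_bank(x, freqs, decay=0.99, fs=16000.0):
    """Each unit's complex state is a leaky sum of the input spun at its
    frequency, approximating a sliding short-time Fourier bin."""
    rot = np.exp(2j * np.pi * np.asarray(freqs) / fs)   # per-sample phase step
    z = np.zeros(len(freqs), dtype=complex)
    mags = []
    for sample in x:
        z = decay * rot * z + sample                    # resonate
        mags.append(np.abs(z))                          # |STFT bin| proxy
    return np.array(mags)

tone = np.sin(2 * np.pi * 440 * np.arange(1600) / 16000.0)
print(rf_bank(tone, freqs=[220.0, 440.0, 880.0])[-1])   # 440 Hz unit dominates
```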
arXiv Detail & Related papers (2021-11-05T22:37:05Z)
- Spatiotemporal Spike-Pattern Selectivity in Single Mixed-Signal Neurons with Balanced Synapses [0.27998963147546135]
Mixed-signal neuromorphic processors could be used for inference and learning.
We show how inhomogeneous synaptic circuits could be utilized for resource-efficient implementation of network layers.
arXiv Detail & Related papers (2021-06-10T12:04:03Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Non-parametric generalized linear model [7.936841911281107]
A fundamental problem in statistical neuroscience is to model how neurons encode information by analyzing electrophysiological recordings.
A popular and widely-used approach is to fit the spike trains with an autoregressive point process model.
In practice a sufficiently rich but small ensemble of temporal basis functions needs to be chosen to parameterize the filters.
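One standard such ensemble is log-spaced raised cosines (the recipe below is a common convention with arbitrary counts and spacing; the paper's non-parametric approach is meant to avoid exactly this choice):

```python
import numpy as np

def raised_cosine_basis(n_basis=6, n_lags=100):
    """Raised cosine bumps, log-spaced in time, with compact support;
    a common parameterization of GLM spike-history filters."""
    t = np.arange(n_lags)
    centers = np.linspace(np.log(2.0), np.log(1.0 + n_lags), n_basis)
    width = centers[1] - centers[0]
    phi = (np.log(1.0 + t)[None, :] - centers[:, None]) * np.pi / (2 * width)
    return 0.5 * (1.0 + np.cos(np.clip(phi, -np.pi, np.pi)))  # (n_basis, n_lags)

# A temporal filter is then a weighted sum of the basis functions.
B = raised_cosine_basis()
k = np.array([1.0, -0.5, 0.2, 0.0, 0.1, 0.0]) @ B
```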
arXiv Detail & Related papers (2020-09-02T21:54:53Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
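A sketch of one such unit based on our reading of the model (explicit-Euler step; parameter names are ours): a first-order linear system whose effective time constant is modulated by a nonlinear gate over states and inputs.

```python
import numpy as np

def ltc_step(h, x, dt, Wh, Wx, b, tau=1.0, A=1.0):
    """One liquid time-constant update: the gate f scales both the
    leak rate and the drive toward the bias level A."""
    f = np.tanh(Wh @ h + Wx @ x + b)       # interlinked nonlinear gate
    dh = -(1.0 / tau + f) * h + f * A      # state-dependent time constant
    return h + dt * dh

h = np.zeros(3)
Wh, Wx, b = np.zeros((3, 3)), np.ones((3, 2)), np.zeros(3)
h = ltc_step(h, np.array([0.5, -0.2]), dt=0.01, Wh=Wh, Wx=Wx, b=b)
```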
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Exploiting Neuron and Synapse Filter Dynamics in Spatial Temporal Learning of Deep Spiking Neural Network [7.503685643036081]
A bio-plausible SNN model with spatial-temporal property is a complex dynamic system.
We formulate SNN as a network of infinite impulse response (IIR) filters with neuron nonlinearity.
We propose a training algorithm that is capable of learning spatial-temporal patterns by searching for the optimal synapse filter kernels and weights.
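A minimal version of the IIR-filter view (the first-order kernel and hard threshold below are simplifying assumptions; the paper searches over more general synapse filter kernels):

```python
import numpy as np

def iir_synapse(spikes, alpha=0.9):
    """First-order IIR filter on a spike train: y[t] = alpha*y[t-1] + s[t]."""
    y = np.zeros(len(spikes))
    for t, s in enumerate(spikes):
        y[t] = alpha * (y[t - 1] if t > 0 else 0.0) + s
    return y

# Neuron nonlinearity: fire where the weighted filtered input crosses threshold.
pre = np.array([0, 1, 0, 0, 1, 1, 0, 0], dtype=float)
post = (1.5 * iir_synapse(pre) > 1.0).astype(float)
print(post)
```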
arXiv Detail & Related papers (2020-02-19T01:27:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.