Bubblewrap: Online tiling and real-time flow prediction on neural manifolds
- URL: http://arxiv.org/abs/2108.13941v1
- Date: Tue, 31 Aug 2021 16:01:45 GMT
- Title: Bubblewrap: Online tiling and real-time flow prediction on neural manifolds
- Authors: Anne Draelos, Pranjal Gupta, Na Young Jun, Chaichontat Sriworarat, John Pearson
- Abstract summary: We propose a method that combines fast, stable dimensionality reduction with a soft tiling of the resulting neural manifold.
The resulting model can be trained at kilohertz data rates, produces accurate approximations of neural dynamics within minutes, and generates predictions on submillisecond time scales.
- Score: 2.624902795082451
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: While most classic studies of function in experimental neuroscience have
focused on the coding properties of individual neurons, recent developments in
recording technologies have resulted in an increasing emphasis on the dynamics
of neural populations. This has given rise to a wide variety of models for
analyzing population activity in relation to experimental variables, but direct
testing of many neural population hypotheses requires intervening in the system
based on current neural state, necessitating models capable of inferring neural
state online. Existing approaches, primarily based on dynamical systems,
require strong parametric assumptions that are easily violated in the
noise-dominated regime and do not scale well to the thousands of data channels
in modern experiments. To address this problem, we propose a method that
combines fast, stable dimensionality reduction with a soft tiling of the
resulting neural manifold, allowing dynamics to be approximated as a
probability flow between tiles. This method can be fit efficiently using online
expectation maximization, scales to tens of thousands of tiles, and outperforms
existing methods when dynamics are noise-dominated or feature multi-modal
transition probabilities. The resulting model can be trained at kilohertz data
rates, produces accurate approximations of neural dynamics within minutes, and
generates predictions on submillisecond time scales. It retains predictive
performance throughout many time steps into the future and is fast enough to
serve as a component of closed-loop causal experiments.
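As a rough illustration of the tiling-plus-flow idea (a sketch under assumptions, not the authors' implementation): represent tiles as Gaussian bumps, assign each incoming low-dimensional point a soft responsibility over tiles, drift the tile centers toward the data with a streaming EM-flavored step, and accumulate soft transition counts between consecutive time steps; row-normalizing the counts gives the probability flow used for prediction. All class and parameter names below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

class SoftTileFlow:
    """Toy soft tiling + probability-flow model (illustrative, not the paper's algorithm)."""

    def __init__(self, n_tiles, dim, scale=0.5, lr=0.05):
        self.mu = rng.normal(size=(n_tiles, dim))   # tile centers
        self.scale = scale                          # shared isotropic bandwidth
        self.lr = lr                                # streaming update rate
        self.counts = np.ones((n_tiles, n_tiles))   # smoothed soft transition counts
        self.prev_r = None                          # responsibilities of last point

    def responsibilities(self, x):
        d2 = ((x - self.mu) ** 2).sum(axis=1)
        w = np.exp(-0.5 * d2 / self.scale ** 2)
        return w / w.sum()

    def update(self, x):
        r = self.responsibilities(x)
        # streaming step: nudge centers toward the point, weighted by responsibility
        self.mu += self.lr * r[:, None] * (x - self.mu)
        if self.prev_r is not None:
            # accumulate soft transition counts between consecutive tiles
            self.counts += np.outer(self.prev_r, r)
        self.prev_r = r

    def predict(self, steps=1):
        """Predicted tile-occupancy distribution `steps` ahead of the last point."""
        T = self.counts / self.counts.sum(axis=1, keepdims=True)
        p = self.prev_r.copy()
        for _ in range(steps):
            p = p @ T
        return p

# demo on a noisy 2-D circular trajectory
model = SoftTileFlow(n_tiles=20, dim=2)
for t in range(2000):
    theta = 0.05 * t
    x = np.array([np.cos(theta), np.sin(theta)]) + 0.05 * rng.normal(size=2)
    model.update(x)
print("most likely next tile:", model.predict(steps=1).argmax())
```

In this sketch each update costs O(K·d) for the responsibilities plus O(K^2) for the transition counts, which suggests why a tiling model of this style can keep pace with high sample rates when the number of tiles K is moderate.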
Related papers
- Inferring stochastic low-rank recurrent neural networks from neural data [5.179844449042386]
A central aim in computational neuroscience is to relate the activity of large populations of neurons to an underlying dynamical system.
Low-rank recurrent neural networks (RNNs) exhibit such interpretability by having tractable dynamics.
Here, we propose to fit low-rank RNNs with variational sequential Monte Carlo methods.
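A quick NumPy illustration of why low-rank recurrence is tractable (sizes, nonlinearity, and time constant are invented for the example; the paper's actual contribution, fitting such networks with variational sequential Monte Carlo, is not shown): with connectivity J = m n^T of rank R, the N-dimensional activity is driven through only R directions, so the effective dynamics live in an R-dimensional latent space.

```python
import numpy as np

rng = np.random.default_rng(1)
N, R, T, dt = 200, 2, 500, 0.1           # neurons, rank, steps, step size

# rank-R connectivity J = m @ n.T confines the dynamics to R latent directions
m = rng.normal(size=(N, R)) / np.sqrt(N)
n = rng.normal(size=(N, R))

x = rng.normal(size=N) * 0.1
kappa = np.zeros((T, R))                 # latent trajectory kappa_r = n_r . phi(x) / N
for t in range(T):
    phi = np.tanh(x)
    kappa[t] = n.T @ phi / N
    x = x + dt * (-x + m @ (n.T @ phi))  # dx/dt = -x + J phi(x)

print("latent variance per rank direction:", kappa.var(axis=0))
```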
arXiv Detail & Related papers (2024-06-24T15:57:49Z) - Modeling Randomly Observed Spatiotemporal Dynamical Systems [7.381752536547389]
Currently available neural network-based modeling approaches fall short when faced with data collected randomly over time and space.
In response, we developed a new method that effectively handles such randomly sampled data.
Our model integrates techniques from amortized variational inference, neural differential equations, neural point processes, and implicit neural representations to predict both the dynamics of the system and the timings and locations of future observations.
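The model combines several ingredients; as a hedged sketch of just two of them, the snippet below pairs a tiny neural-ODE latent state with a neural point-process intensity that scores when observations arrive. Every function and weight here is invented for illustration and does not reflect the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(2)
D = 4                                        # latent dimension
W1, W2 = rng.normal(size=(D, D)) * 0.3, rng.normal(size=(D, D)) * 0.3
w_int = rng.normal(size=D)                   # intensity readout (illustrative)

def f(h):
    """Tiny neural-ODE vector field dh/dt = W2 tanh(W1 h)."""
    return W2 @ np.tanh(W1 @ h)

def intensity(h):
    """Point-process rate lambda(h) = softplus(w . h) for observation timing."""
    return np.log1p(np.exp(w_int @ h))

def log_lik_timing(h0, obs_times, dt=0.01):
    """Euler-integrate the latent state; score irregular observation times
    under an inhomogeneous Poisson process with rate lambda(h(t))."""
    h, t, ll, next_i = h0.copy(), 0.0, 0.0, 0
    while t < obs_times[-1]:
        lam = intensity(h)
        ll -= lam * dt                       # survival term: -integral of lambda dt
        if next_i < len(obs_times) and t + dt >= obs_times[next_i]:
            ll += np.log(lam)                # event term at each observation
            next_i += 1
        h = h + dt * f(h)
        t += dt
    return ll

print(log_lik_timing(rng.normal(size=D), obs_times=[0.3, 0.8, 1.5]))
```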
arXiv Detail & Related papers (2024-06-01T09:03:32Z) - Real-Time Variational Method for Learning Neural Trajectory and its Dynamics [7.936841911281107]
We introduce the exponential family variational Kalman filter (eVKF), an online Bayesian method aimed at inferring latent trajectories while simultaneously learning the dynamical system generating them.
We derive a closed-form variational analogue to the predict step of the Kalman filter, which leads to a provably tighter bound on the ELBO compared to another online variational method.
We validate our method on synthetic and real-world data, and, notably, show that it achieves competitive performance.
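The entry highlights a closed-form variational analogue of the Kalman filter's predict step. For reference, here is the classic linear-Gaussian predict/update pair that such methods generalize; this is textbook Kalman filtering in NumPy, not the eVKF itself.

```python
import numpy as np

def kalman_predict(m, P, A, Q):
    """Predict step: push the Gaussian belief through linear dynamics x' = A x + noise."""
    return A @ m, A @ P @ A.T + Q

def kalman_update(m, P, y, C, R):
    """Update step: condition the predicted belief on observation y = C x + noise."""
    S = C @ P @ C.T + R                      # innovation covariance
    K = P @ C.T @ np.linalg.inv(S)           # Kalman gain
    m_new = m + K @ (y - C @ m)
    P_new = (np.eye(len(m)) - K @ C) @ P
    return m_new, P_new

# one predict/update cycle on a 2-D latent with a 1-D observation
A = np.array([[1.0, 0.1], [0.0, 1.0]])
Q = 0.01 * np.eye(2)
C = np.array([[1.0, 0.0]])
R = np.array([[0.1]])
m, P = np.zeros(2), np.eye(2)
m, P = kalman_predict(m, P, A, Q)
m, P = kalman_update(m, P, y=np.array([0.5]), C=C, R=R)
print(m, np.diag(P))
```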
arXiv Detail & Related papers (2023-05-18T19:52:46Z) - Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
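One common way to make a recurrent network continuous in time, sketched here under assumptions rather than as the paper's CTRNN: let the hidden state decay exponentially over the irregular gap since the last measurement, then absorb the new observation. Weights, sizes, and the decay constant are placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)
H = 8                                        # hidden size
Wx, Wh = rng.normal(size=(H, 1)) * 0.5, rng.normal(size=(H, H)) * 0.3
tau = 1.0                                    # decay time constant

def step(h, y, gap):
    """Decay h over the irregular time gap, then absorb the new observation y."""
    h = h * np.exp(-gap / tau)               # continuous-time exponential decay
    return np.tanh(Wh @ h + Wx @ np.array([y]))

# irregularly sampled series: (time, value) pairs
series = [(0.0, 5.4), (0.4, 5.1), (1.7, 6.0), (1.9, 6.2), (3.5, 5.8)]
h, t_prev = np.zeros(H), 0.0
for t, y in series:
    h = step(h, y, gap=t - t_prev)
    t_prev = t
print("hidden state after irregular series:", np.round(h, 3))
```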
arXiv Detail & Related papers (2023-04-14T09:39:06Z) - An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
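A generic physics-informed loss in the spirit of, but not identical to, EINNs: a small network maps time to SIR compartments, and the training objective adds the ODE residual (computed by finite differences here, to keep the sketch dependency-free) to the data-fit term. The rates beta and gamma and the network shapes are placeholders.

```python
import numpy as np

rng = np.random.default_rng(4)
W1, b1 = rng.normal(size=(16, 1)) * 0.5, np.zeros(16)
W2, b2 = rng.normal(size=(3, 16)) * 0.5, np.zeros(3)
beta, gamma = 0.3, 0.1                       # illustrative SIR rates

def net(t):
    """Map time to compartment fractions (S, I, R) that sum to 1."""
    h = np.tanh(W1 @ np.array([t]) + b1)
    z = W2 @ h + b2
    e = np.exp(z - z.max())
    return e / e.sum()

def sir_residual(t, eps=1e-3):
    """Squared mismatch between the network's finite-difference derivative and
    the SIR right-hand side: dS=-beta*S*I, dI=beta*S*I-gamma*I, dR=gamma*I."""
    s, i, r = net(t)
    ds, di, dr = (net(t + eps) - net(t - eps)) / (2 * eps)
    rhs = np.array([-beta * s * i, beta * s * i - gamma * i, gamma * i])
    return ((np.array([ds, di, dr]) - rhs) ** 2).sum()

def loss(data, colloc_times):
    """Data fit on observed infected fractions + physics penalty at collocation points."""
    fit = sum((net(t)[1] - i_obs) ** 2 for t, i_obs in data)
    phys = sum(sir_residual(t) for t in colloc_times)
    return fit + phys

print(loss(data=[(1.0, 0.02), (5.0, 0.10)], colloc_times=np.linspace(0.0, 10.0, 20)))
```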
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - Robust alignment of cross-session recordings of neural population activity by behaviour via unsupervised domain adaptation [1.2617078020344619]
We introduce a model capable of inferring behaviourally relevant latent dynamics from previously unseen data recorded from the same animal.
We show that unsupervised domain adaptation combined with a sequential variational autoencoder, trained on several sessions, can achieve good generalisation to unseen data.
arXiv Detail & Related papers (2022-02-12T22:17:30Z) - Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks [68.8204255655161]
Small neural networks with a constrained number of trainable parameters can be suitable resource-efficient candidates for many simple tasks.
We explore the diversity of the neurons within the hidden layer during the learning process.
We analyze how the diversity of the neurons affects predictions of the model.
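One plausible way to operationalize the diversity of hidden neurons, offered as an assumption rather than the paper's actual measure: compute the pairwise correlation of hidden-unit activations over a batch and treat large off-diagonal correlations as redundancy.

```python
import numpy as np

rng = np.random.default_rng(5)
W = rng.normal(size=(16, 4)) * 0.5           # small hidden layer, 4 inputs

def hidden(X):
    return np.tanh(X @ W.T)                  # (batch, 16) hidden activations

def diversity_penalty(H):
    """Mean squared off-diagonal correlation between hidden units:
    low values mean neurons respond in distinct ways (more 'diverse')."""
    C = np.corrcoef(H, rowvar=False)         # (16, 16) unit-by-unit correlation
    off = C - np.diag(np.diag(C))
    return (off ** 2).mean()

X = rng.normal(size=(256, 4))
print("redundancy of random init:", diversity_penalty(hidden(X)))
```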
arXiv Detail & Related papers (2021-09-20T15:12:16Z) - Stochastic Recurrent Neural Network for Multistep Time Series Forecasting [0.0]
We leverage advances in deep generative models and the concept of state space models to propose an adaptation of the recurrent neural network for time series forecasting.
Our model preserves the architectural workings of a recurrent neural network for which all relevant information is encapsulated in its hidden states, and this flexibility allows our model to be easily integrated into any deep architecture for sequential modelling.
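The core state-space idea can be sketched as follows (a toy sampler, not the paper's architecture): make the hidden transition stochastic, then form multistep forecasts by averaging over sampled latent trajectories, which yields predictive uncertainty for free. All weights and the noise scale are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
H = 8
Wh, Wy = rng.normal(size=(H, H)) * 0.3, rng.normal(size=(1, H)) * 0.5
sigma = 0.1                                  # latent transition noise scale

def forecast(h0, steps, n_samples=500):
    """Sample latent trajectories h_t = tanh(Wh h_{t-1}) + noise and
    summarize the induced predictive distribution over y_t = Wy h_t."""
    ys = np.empty((n_samples, steps))
    for s in range(n_samples):
        h = h0.copy()
        for t in range(steps):
            h = np.tanh(Wh @ h) + sigma * rng.normal(size=H)
            ys[s, t] = (Wy @ h)[0]
    return ys.mean(axis=0), ys.std(axis=0)

mean, std = forecast(np.zeros(H), steps=10)
print("10-step forecast mean +- std:", np.round(mean, 2), np.round(std, 2))
```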
arXiv Detail & Related papers (2021-04-26T01:43:43Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
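A miniature of the NDP idea, with the probabilistic treatment of the latent code deliberately omitted: context points are encoded into a vector z, and z conditions the vector field of an ODE that is then integrated to make predictions. All weights and sizes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(7)
D, Z = 2, 3                                  # state dim, latent code dim
W_enc = rng.normal(size=(Z, D + 1)) * 0.5    # encodes (t, x) context pairs
W_f = rng.normal(size=(D, D + Z)) * 0.3      # latent-conditioned vector field

def encode(context):
    """Aggregate (t, x) context points into one latent code z (mean-pooled)."""
    feats = [np.tanh(W_enc @ np.concatenate(([t], x))) for t, x in context]
    return np.mean(feats, axis=0)

def f(x, z):
    """Vector field dx/dt = tanh(W_f [x; z]); z shifts the whole dynamics."""
    return np.tanh(W_f @ np.concatenate((x, z)))

def predict(context, x0, t_max, dt=0.01):
    """Condition on the context, then Euler-integrate the resulting ODE."""
    z, x = encode(context), x0.copy()
    for _ in range(int(t_max / dt)):
        x = x + dt * f(x, z)
    return x

context = [(0.0, np.array([1.0, 0.0])), (0.5, np.array([0.8, 0.4]))]
print(predict(context, x0=np.array([1.0, 0.0]), t_max=1.0))
```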
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
In a similar way, artificial neurons in our generative model predict what neighboring neurons will do, and adjust their parameters based on how well the predictions matched reality.
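A minimal two-layer predictive-coding loop in this spirit (a toy, not the paper's model): the higher layer predicts the lower layer's activity, inference descends the prediction error to find the latent code, and the weights get a local error-driven nudge from the same signal. All shapes and learning rates are placeholders.

```python
import numpy as np

rng = np.random.default_rng(8)
D_lo, D_hi = 8, 3
W = rng.normal(size=(D_lo, D_hi)) * 0.3      # top-down prediction weights

def settle(x, n_steps=50, lr_z=0.1, lr_w=0.01):
    """Infer the higher-layer code z by minimizing the prediction error
    x - W z, then nudge W with the same local error signal."""
    global W
    z = np.zeros(D_hi)
    for _ in range(n_steps):
        err = x - W @ z                      # what the top layer failed to predict
        z += lr_z * (W.T @ err)              # activity update: descend the error
    W += lr_w * np.outer(x - W @ z, z)       # local gradient step on squared error
    return z

for _ in range(200):                         # adapt to a fixed sensory pattern
    x = np.array([1, 0, 1, 0, 1, 0, 1, 0], dtype=float) + 0.05 * rng.normal(size=D_lo)
    z = settle(x)
print("residual error norm:", np.linalg.norm(x - W @ z))
```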
arXiv Detail & Related papers (2020-12-07T01:20:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.