A Spacetime Perspective on Dynamical Computation in Neural Information Processing Systems
- URL: http://arxiv.org/abs/2409.13669v1
- Date: Fri, 20 Sep 2024 17:25:37 GMT
- Title: A Spacetime Perspective on Dynamical Computation in Neural Information Processing Systems
- Authors: T. Anderson Keller, Lyle Muller, Terrence J. Sejnowski, Max Welling
- Abstract summary: We introduce a new 'spacetime' perspective on neural computation.
We show that spatiotemporal dynamics may be a mechanism by which natural neural systems encode approximate visual, temporal, and abstract symmetries of the world as conserved quantities.
- Score: 43.02233537621737
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: There is now substantial evidence for traveling waves and other structured spatiotemporal recurrent neural dynamics in cortical structures; but these observations have typically been difficult to reconcile with notions of topographically organized selectivity and feedforward receptive fields. We introduce a new 'spacetime' perspective on neural computation in which structured selectivity and dynamics are not contradictory but instead are complementary. We show that spatiotemporal dynamics may be a mechanism by which natural neural systems encode approximate visual, temporal, and abstract symmetries of the world as conserved quantities, thereby enabling improved generalization and long-term working memory.
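As a loose, self-contained illustration of the "conserved quantity" idea (a toy sketch, not the paper's model): a 1-D wave equation on a ring of units conserves total wave energy as activity travels, which is the flavor of invariant the abstract attributes to spatiotemporal dynamics. All parameters are arbitrary demonstration values.
```python
import numpy as np

# Toy ring of units obeying a discretized wave equation; the total wave
# energy is a conserved quantity carried by the traveling dynamics.
N, c, dt, dx = 128, 1.0, 0.01, 1.0
u = np.exp(-0.5 * ((np.arange(N) - N / 2) / 4.0) ** 2)  # activity bump
v = np.zeros(N)                                          # velocity field

def laplacian(x):
    return np.roll(x, 1) + np.roll(x, -1) - 2 * x

def energy(u, v):
    grad = (np.roll(u, -1) - u) / dx
    return 0.5 * np.sum(v**2) + 0.5 * c**2 * np.sum(grad**2)

e0 = energy(u, v)
for _ in range(1000):            # semi-implicit (energy-friendly) update
    v += dt * c**2 * laplacian(u) / dx**2
    u += dt * v
print(f"relative energy drift after 1000 steps: {abs(energy(u, v) - e0) / e0:.2e}")
```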
Related papers
- Neuromorphic Computing with Multi-Frequency Oscillations: A Bio-Inspired Approach to Artificial Intelligence [7.742102806887099]
Despite remarkable capabilities, artificial neural networks fall short of flexible, generalizable intelligence. This limitation stems from their fundamental divergence from biological cognition: they overlook both the functional specialization of neural regions and the temporal dynamics critical for coordinating these specialized systems. We propose a tripartite brain-inspired architecture comprising functionally specialized perceptual, auxiliary, and executive systems.
arXiv Detail & Related papers (2025-08-04T08:40:33Z) - Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
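A minimal sketch of the stated dynamics, assuming nothing about the paper's actual architecture: an Euler-Maruyama simulation of the underdamped Langevin equation, with a quadratic well standing in for the learned potential and `gamma`, `sigma` as illustrative damping and noise scales.
```python
import numpy as np

rng = np.random.default_rng(0)
gamma, sigma, dt, D = 0.5, 0.3, 0.01, 3   # damping, noise scale, step size, latent dim

def grad_U(z):
    # Stand-in for the gradient of the learned potential; a quadratic well here.
    return z

z, v = rng.normal(size=D), np.zeros(D)    # latent position and velocity (inertia)
traj = []
for _ in range(2000):   # Euler-Maruyama on dv = (-dU - gamma*v) dt + sigma dW
    v += dt * (-grad_U(z) - gamma * v) + sigma * np.sqrt(dt) * rng.normal(size=D)
    z += dt * v
    traj.append(z.copy())
print("latent trajectory shape:", np.array(traj).shape)   # (2000, 3)
```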
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - Allostatic Control of Persistent States in Spiking Neural Networks for perception and computation [79.16635054977068]
We introduce a novel model for updating perceptual beliefs about the environment by extending the concept of Allostasis to the control of internal representations.
In this paper, we focus on an application in numerical cognition, where a bump of activity in an attractor network is used as a spatial numerical representation.
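For intuition, a minimal ring-attractor sketch with standard rate dynamics (not the paper's spiking implementation; the kernel and gains are illustrative): local excitation plus uniform inhibition lets a bump of activity persist after the cue is removed, so its position can act as a numerical representation.
```python
import numpy as np

N, dt, tau = 128, 0.1, 1.0
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)
# Local excitation minus uniform inhibition stabilizes a single bump.
W = 6.0 * np.cos(theta[:, None] - theta[None, :]) - 1.0

r = np.maximum(0.0, np.cos(theta - np.pi))        # transient cue at theta = pi
for _ in range(500):                              # cue removed; bump self-sustains
    r += dt / tau * (-r + np.maximum(0.0, np.tanh(W @ r / N)))
print("persistent bump peak (radians):", round(theta[np.argmax(r)], 3))  # ~ 3.14
```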
arXiv Detail & Related papers (2025-03-20T12:28:08Z) - Conservation-informed Graph Learning for Spatiotemporal Dynamics Prediction [84.26340606752763]
In this paper, we introduce the conservation-informed GNN (CiGNN), an end-to-end explainable learning framework.
The network is designed to conform to a general conservation law via symmetry, with conservative and non-conservative information passing over a multiscale space driven by a latent temporal marching strategy.
Results demonstrate that CiGNN exhibits remarkable accuracy and generalizability relative to baselines, and is readily applicable to learning to predict various spatiotemporal dynamics.
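A sketch of conservation-by-construction under strong simplifications (scalar node features, a fixed ring graph, a hand-written message in place of learned ones): antisymmetrizing each edge flux makes every exchange zero-sum, so the node-summed quantity stays constant during temporal marching.
```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.random(6)                                   # conserved scalar field on 6 nodes
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
dt = 0.05

def g(a, b):                                        # stand-in for a learned message
    return np.tanh(0.8 * a + 0.3 * b)

def flux(a, b):                                     # antisymmetric: flux(a,b) == -flux(b,a)
    return g(a, b) - g(b, a)

total_before = u.sum()
for _ in range(200):                                # latent temporal marching
    du = np.zeros_like(u)
    for i, j in edges:
        f = flux(u[i], u[j])                        # flow from node i to node j
        du[i] -= f
        du[j] += f
    u += dt * du
print("conservation error:", abs(u.sum() - total_before))   # ~ 0 up to float error
```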
arXiv Detail & Related papers (2024-12-30T13:55:59Z) - Neuron: Learning Context-Aware Evolving Representations for Zero-Shot Skeleton Action Recognition [64.56321246196859]
We propose a novel dyNamically Evolving dUal skeleton-semantic syneRgistic framework.
We first construct the spatial-temporal evolving micro-prototypes and integrate dynamic context-aware side information.
We introduce the spatial compression and temporal memory mechanisms to guide the growth of spatial-temporal micro-prototypes.
arXiv Detail & Related papers (2024-11-18T05:16:11Z) - Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
We introduce Artificial Kuramoto Oscillatory Neurons (AKOrN) as a dynamical alternative to threshold units.
We show that this idea provides performance improvements across a wide spectrum of tasks.
We believe that these empirical results show the importance of rethinking assumptions at the most basic, neuronal level of neural representation.
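A minimal sketch of the underlying Kuramoto dynamics (the classic model, not AKOrN's trained variant): each unit is a phase oscillator, and sinusoidal coupling pulls initially incoherent phases toward synchrony, which the standard order parameter quantifies.
```python
import numpy as np

rng = np.random.default_rng(0)
N, K, dt = 64, 2.0, 0.01
omega = rng.normal(0.0, 0.5, N)          # natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)     # phases: the oscillatory "neuron" state

def coherence(theta):
    return np.abs(np.mean(np.exp(1j * theta)))   # 1.0 = fully synchronized

print("coherence before:", round(coherence(theta), 3))
for _ in range(3000):   # d(theta_i)/dt = omega_i + (K/N) sum_j sin(theta_j - theta_i)
    coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
    theta += dt * (omega + K * coupling)
print("coherence after: ", round(coherence(theta), 3))
```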
arXiv Detail & Related papers (2024-10-17T17:47:54Z) - Neural timescales from a computational perspective [5.390514665166601]
Timescales of neural activity are diverse across and within brain areas, and experimental observations suggest that neural timescales reflect information in dynamic environments.
Here, we take a complementary perspective and synthesize three directions where computational methods can distill the broad set of empirical observations into quantitative and testable theories.
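One standard computational method in this space, sketched with an AR(1) surrogate for neural activity (my simplification, not the paper's proposal): estimate a neural timescale by fitting an exponential decay to the empirical autocorrelation.
```python
import numpy as np

rng = np.random.default_rng(0)
tau_true, T = 20.0, 50_000                     # ground-truth timescale (in steps)
alpha = np.exp(-1.0 / tau_true)
x = np.zeros(T)
for t in range(1, T):                          # AR(1) surrogate for neural activity
    x[t] = alpha * x[t - 1] + rng.normal()

lags = np.arange(1, 40)
ac = np.array([np.corrcoef(x[:-k], x[k:])[0, 1] for k in lags])
# Since ac(k) ~ exp(-k / tau), a linear fit to log ac(k) recovers tau.
tau_hat = -1.0 / np.polyfit(lags, np.log(np.maximum(ac, 1e-6)), 1)[0]
print(f"true tau = {tau_true}, estimated tau = {tau_hat:.1f}")
```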
arXiv Detail & Related papers (2024-09-04T13:16:20Z) - Learning Spatiotemporal Dynamical Systems from Point Process Observations [7.381752536547389]
Current neural network-based modeling approaches fall short when faced with data that is collected randomly over time and space.
In response, we developed a new method that can effectively learn from such point process observations.
Our model integrates techniques from neural differential equations, neural point processes, implicit neural representations and amortized variational inference.
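A toy sketch of the overall recipe, with a hand-written linear vector field and readout standing in for the learned networks (and plain Euler in place of an adaptive ODE solver): integrate a latent ODE between irregularly spaced event times and read out a point-process intensity from the latent state.
```python
import numpy as np

rng = np.random.default_rng(0)
event_times = np.sort(rng.uniform(0, 10, 15))   # irregularly observed events
z = np.array([1.0, 0.0])                        # latent state
A = np.array([[0.0, -1.0], [1.0, -0.1]])        # stand-in for a learned vector field
w = np.array([0.8, -0.4])                       # stand-in for a learned readout
dt, t_prev = 0.01, 0.0

def softplus(x):                                # keeps the intensity positive
    return np.log1p(np.exp(x))

for t_event in event_times:
    for _ in range(int((t_event - t_prev) / dt)):   # evolve latent ODE dz/dt = A z
        z = z + dt * (A @ z)
    print(f"t = {t_event:5.2f}   intensity = {softplus(w @ z):.3f}")
    t_prev = t_event
```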
arXiv Detail & Related papers (2024-06-01T09:03:32Z) - Covariant spatio-temporal receptive fields for neuromorphic computing [1.9365675487641305]
This work combines efforts within scale-space theory and computational neuroscience to identify theoretically well-founded ways to process spatio-temporal signals in neuromorphic systems.
Our contributions are immediately relevant for signal processing and event-based vision, and can be extended to other processing tasks over space and time, such as memory and control.
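A rough stand-in for the receptive-field family involved (my simplified construction, not the paper's covariant derivation): a spatial Gaussian-derivative kernel translated at a preferred velocity and windowed in time gives a velocity-adapted spatio-temporal filter.
```python
import numpy as np

def gaussian(x, s):
    return np.exp(-x**2 / (2 * s**2)) / (np.sqrt(2 * np.pi) * s)

def dgaussian(x, s):
    return -x / s**2 * gaussian(x, s)            # first spatial derivative

x = np.arange(-10, 11)                            # spatial support
t = np.arange(0, 21)                              # temporal support
sx, st, v = 2.0, 3.0, 0.5                         # spatial scale, temporal scale, velocity

# Each temporal slice is the spatial derivative kernel shifted by v * t,
# weighted by a temporal window: a filter tuned to motion at speed v.
kernel = np.array([dgaussian(x - v * ti, sx) * gaussian(ti - 10.0, st) for ti in t])
print("spatio-temporal kernel shape (t, x):", kernel.shape)   # (21, 21)
```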
arXiv Detail & Related papers (2024-05-01T04:51:10Z) - Exploring neural oscillations during speech perception via surrogate gradient spiking neural networks [59.38765771221084]
We present a physiologically inspired speech recognition architecture that is compatible with, and scalable within, deep learning frameworks.
We show end-to-end gradient descent training leads to the emergence of neural oscillations in the central spiking neural network.
Our findings highlight the crucial inhibitory role of feedback mechanisms, such as spike frequency adaptation and recurrent connections, in regulating and synchronising neural activity to improve recognition performance.
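A minimal sketch of the surrogate-gradient ingredient together with spike-frequency adaptation (generic SNN practice, not the paper's exact architecture; `alpha`, `rho`, and `b` are illustrative constants): the forward pass thresholds the membrane potential, while a smooth surrogate derivative stands in for the Heaviside during backpropagation.
```python
import numpy as np

def spike_forward(v, threshold=1.0):
    # Forward pass: hard threshold emits a binary spike.
    return (v >= threshold).astype(float)

def spike_surrogate_grad(v, threshold=1.0, beta=10.0):
    # Backward pass: a fast-sigmoid derivative replaces the Heaviside's
    # zero-almost-everywhere gradient so gradient descent can train the SNN.
    return 1.0 / (1.0 + beta * np.abs(v - threshold)) ** 2

def lif_step(v, i_in, a, alpha=0.9, rho=0.95, b=0.2):
    # Leaky integrate-and-fire with spike-frequency adaptation: the adaptation
    # current a grows with each spike and inhibits the membrane potential.
    v = alpha * v + i_in - a
    s = spike_forward(v)
    a = rho * a + b * s
    v = v * (1.0 - s)          # reset on spike
    return v, s, a

v, a = np.zeros(4), np.zeros(4)
v, s, a = lif_step(v, np.array([0.5, 1.2, 0.9, 2.0]), a)
print("spikes:", s)            # [0. 1. 0. 1.]
```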
arXiv Detail & Related papers (2024-04-22T09:40:07Z) - Backpropagation through space, time, and the brain [2.10686639478348]
We introduce General Latent Equilibrium (GLE), a computational framework for fully local spatio-temporal credit assignment in physical, dynamical networks of neurons.
In particular, GLE exploits the morphology of dendritic trees to enable more complex information storage and processing in single neurons.
arXiv Detail & Related papers (2024-03-25T16:57:02Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
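A simplified, hypothetical flavor of the leaky-memory idea (not the published ELM equations; `W_in` and `W_out` stand in for trained weights): a bank of memory units with log-spaced time constants integrates input at many timescales, which is what lets such a neuron model handle long-horizon tasks.
```python
import numpy as np

rng = np.random.default_rng(0)
D_in, D_mem = 8, 16
taus = np.logspace(0, 3, D_mem)             # time constants from 1 to 1000 steps
decay = np.exp(-1.0 / taus)
W_in = 0.3 * rng.normal(size=(D_mem, D_in)) # stand-in for trained input weights
W_out = 0.3 * rng.normal(size=D_mem)        # stand-in for trained readout weights

def elm_like_step(m, x):
    # Each memory unit leaks at its own rate and integrates a nonlinear
    # function of the input and its current state.
    m = decay * m + (1.0 - decay) * np.tanh(W_in @ x + m)
    return m, W_out @ m

m = np.zeros(D_mem)
for _ in range(100):
    m, y = elm_like_step(m, rng.normal(size=D_in))
print("readout after 100 steps:", round(float(y), 3))
```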
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Neural Spatio-Temporal Point Processes [31.474420819149724]
We propose a new class of parameterizations for spatio-temporal point processes which leverage Neural ODEs as a computational method.
We validate our models on data sets from a wide variety of contexts such as seismology, epidemiology, urban mobility, and neuroscience.
arXiv Detail & Related papers (2020-11-09T17:28:23Z) - Nonseparable Symplectic Neural Networks [23.77058934710737]
We propose a novel neural network architecture, Nonseparable Symplectic Neural Networks (NSSNNs).
NSSNNs uncover and embed the symplectic structure of a nonseparable Hamiltonian system from limited observation data.
We show the unique computational merits of our approach to yield long-term, accurate, and robust predictions for large-scale Hamiltonian systems.
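For intuition about why nonseparability matters (a toy integrator demo, not the NSSNN architecture; the quadratic H below is hand-picked): a q-p coupling term rules out leapfrog-style splitting, but the implicit-midpoint rule remains symplectic and holds the energy essentially constant over long rollouts.
```python
import numpy as np

def H(q, p):
    # Toy nonseparable Hamiltonian: the q*p term couples position and momentum.
    return 0.5 * (q**2 + p**2) + 0.5 * q * p

def H_grad(q, p):
    return q + 0.5 * p, p + 0.5 * q        # dH/dq, dH/dp

def implicit_midpoint(q, p, dt, iters=5):
    # Symplectic for general (nonseparable) Hamiltonians; a few fixed-point
    # iterations solve the implicit midpoint update.
    qn, pn = q, p
    for _ in range(iters):
        dHq, dHp = H_grad(0.5 * (q + qn), 0.5 * (p + pn))
        qn, pn = q + dt * dHp, p - dt * dHq
    return qn, pn

q, p, dt = 1.0, 0.0, 0.05
h0 = H(q, p)
for _ in range(2000):                       # long rollout: t = 100
    q, p = implicit_midpoint(q, p, dt)
print("energy drift after 2000 steps:", abs(H(q, p) - h0))
```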
arXiv Detail & Related papers (2020-10-23T19:50:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.