Discovering mesoscopic descriptions of collective movement with neural
stochastic modelling
- URL: http://arxiv.org/abs/2303.09906v2
- Date: Thu, 18 Jan 2024 05:42:20 GMT
- Title: Discovering mesoscopic descriptions of collective movement with neural
stochastic modelling
- Authors: Utkarsh Pratiush, Arshed Nabeel, Vishwesha Guttal, Prathosh AP
- Abstract summary: Collective motion at small to medium group sizes ($\sim$10-1000 individuals, also called the `mesoscale') can show nontrivial features due to stochasticity.
Here, we use a physics-inspired, neural-network based approach to characterize the stochastic group dynamics of interacting individuals.
We apply this technique on both synthetic and real-world datasets, and identify the deterministic and stochastic aspects of the dynamics using drift and diffusion fields.
- Score: 4.7163839266526315
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Collective motion is a ubiquitous phenomenon in nature, inspiring engineers,
physicists and mathematicians to develop mathematical models and bio-inspired
designs. Collective motion at small to medium group sizes ($\sim$10-1000
individuals, also called the `mesoscale'), can show nontrivial features due to
stochasticity. Therefore, characterizing both the deterministic and stochastic
aspects of the dynamics is crucial in the study of mesoscale collective
phenomena. Here, we use a physics-inspired, neural-network based approach to
characterize the stochastic group dynamics of interacting individuals, through
a stochastic differential equation (SDE) that governs the collective dynamics
of the group. We apply this technique on both synthetic and real-world
datasets, and identify the deterministic and stochastic aspects of the dynamics
using drift and diffusion fields, enabling us to make novel inferences about
the nature of order in these systems.
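The core idea of drift-diffusion decomposition can be illustrated without the paper's neural network: estimate the drift and diffusion of an SDE directly from a sampled trajectory via binned conditional moments (Kramers-Moyal coefficients). The sketch below is illustrative only, not the authors' method; the process, bin layout, and parameters are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a 1D Ornstein-Uhlenbeck process as stand-in data:
#   dX = -theta * X dt + sigma dW
theta, sigma, dt, n_steps = 1.0, 0.5, 0.01, 200_000
x = np.empty(n_steps)
x[0] = 0.0
for t in range(n_steps - 1):
    x[t + 1] = x[t] - theta * x[t] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Kramers-Moyal estimates from binned increments:
#   drift  f(x) ~ E[dX | X=x] / dt
#   diff^2 g(x)^2 ~ E[dX^2 | X=x] / dt
dx = np.diff(x)
bins = np.linspace(-1.5, 1.5, 31)
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x[:-1], bins) - 1          # bin index of each starting state
drift = np.full(len(centers), np.nan)
diff2 = np.full(len(centers), np.nan)
for i in range(len(centers)):
    mask = idx == i
    if mask.sum() > 100:                     # skip sparsely visited bins
        drift[i] = dx[mask].mean() / dt
        diff2[i] = (dx[mask] ** 2).mean() / dt

# For this process the estimates should recover drift ~ -theta * x
# and squared diffusion ~ sigma**2, up to sampling noise.
```

The paper's contribution replaces these binned averages with a neural network, which scales the same decomposition to higher-dimensional order parameters where binning becomes infeasible.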
Related papers
- Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
We introduce Artificial Kuramoto Oscillatory Neurons (AKOrN) as a dynamical alternative to threshold units.
We show that this idea provides performance improvements across a wide spectrum of tasks.
We believe that these empirical results show the importance of our assumptions at the most basic neuronal level of neural representation.
arXiv Detail & Related papers (2024-10-17T17:47:54Z) - Automated Discovery of Continuous Dynamics from Videos [4.690264156292023]
We propose an approach to discover a set of state variables that preserve the smoothness of the system dynamics.
We construct a vector field representing the system's dynamics equation, automatically from video streams without prior physical knowledge.
arXiv Detail & Related papers (2024-10-14T03:37:02Z) - Discrete, compositional, and symbolic representations through attractor dynamics [51.20712945239422]
We introduce a novel neural systems model that integrates attractor dynamics with symbolic representations to model cognitive processes akin to the probabilistic language of thought (PLoT).
Our model segments the continuous representational space into discrete basins, with attractor states corresponding to symbolic sequences that reflect the semanticity and compositionality characteristic of symbolic systems, through unsupervised learning rather than relying on pre-defined primitives.
This approach establishes a unified framework that integrates both symbolic and sub-symbolic processing through neural dynamics, a neuroplausible substrate with proven expressivity in AI, offering a more comprehensive model that mirrors the complex duality of cognitive operations.
arXiv Detail & Related papers (2023-10-03T05:40:56Z) - Learning minimal representations of stochastic processes with
variational autoencoders [52.99137594502433]
We introduce an unsupervised machine learning approach to determine the minimal set of parameters required to describe a process.
Our approach enables the autonomous discovery of unknown parameters describing these processes.
arXiv Detail & Related papers (2023-07-21T14:25:06Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Action Matching: Learning Stochastic Dynamics from Samples [10.46643972142224]
Action Matching is a method for learning a rich family of dynamics using only independent samples from its time evolution.
We derive a tractable training objective, which does not rely on explicit assumptions about the underlying dynamics.
Inspired by connections with optimal transport, we derive extensions of Action Matching to learn differential equations and dynamics involving creation and destruction of probability mass.
arXiv Detail & Related papers (2022-10-13T01:49:48Z) - MoDi: Unconditional Motion Synthesis from Diverse Data [51.676055380546494]
We present MoDi, an unconditional generative model that synthesizes diverse motions.
Our model is trained in a completely unsupervised setting from a diverse, unstructured and unlabeled motion dataset.
We show that despite the lack of any structure in the dataset, the latent space can be semantically clustered.
arXiv Detail & Related papers (2022-06-16T09:06:25Z) - A duality connecting neural network and cosmological dynamics [0.0]
We show that the dynamics of neural networks trained with gradient descent and the dynamics of scalar fields in a flat, vacuum energy dominated Universe are structurally related.
This duality provides the framework for synergies between these systems, to understand and explain neural network dynamics.
arXiv Detail & Related papers (2022-02-22T19:00:01Z) - The Role of Isomorphism Classes in Multi-Relational Datasets [6.419762264544509]
We show that isomorphism leakage overestimates performance in multi-relational inference.
We propose isomorphism-aware synthetic benchmarks for model evaluation.
We also demonstrate that isomorphism classes can be utilised through a simple prioritisation scheme.
arXiv Detail & Related papers (2020-09-30T12:15:24Z) - Euclideanizing Flows: Diffeomorphic Reduction for Learning Stable
Dynamical Systems [74.80320120264459]
We present an approach to learn such motions from a limited number of human demonstrations.
The complex motions are encoded as rollouts of a stable dynamical system.
The efficacy of this approach is demonstrated through validation on an established benchmark as well as on demonstrations collected on a real-world robotic system.
arXiv Detail & Related papers (2020-05-27T03:51:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.