Modulated Neural ODEs
- URL: http://arxiv.org/abs/2302.13262v3
- Date: Mon, 13 Nov 2023 08:09:52 GMT
- Title: Modulated Neural ODEs
- Authors: Ilze Amanda Auzina, Çağatay Yıldız, Sara Magliacane,
Matthias Bethge and Efstratios Gavves
- Abstract summary: We introduce Modulated Neural ODEs (MoNODEs), a novel framework that sets apart dynamics states from underlying static factors of variation.
We test MoNODE on oscillating systems, videos and human walking trajectories, where each trajectory has trajectory-specific modulation.
- Score: 32.2290908839375
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural ordinary differential equations (NODEs) have been proven useful for
learning non-linear dynamics of arbitrary trajectories. However, current NODE
methods capture variations across trajectories only via the initial state value
or by auto-regressive encoder updates. In this work, we introduce Modulated
Neural ODEs (MoNODEs), a novel framework that sets apart dynamics states from
underlying static factors of variation and improves the existing NODE methods.
In particular, we introduce $\textit{time-invariant modulator variables}$ that
are learned from the data. We incorporate our proposed framework into four
existing NODE variants. We test MoNODE on oscillating systems, videos and human
walking trajectories, where each trajectory has trajectory-specific modulation.
Our framework consistently improves the existing models' ability to generalize
to new dynamic parameterizations and to perform far-horizon forecasting. In
addition, we verify that the proposed modulator variables are informative of
the true unknown factors of variation as measured by $R^2$ scores.
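The core mechanism described in the abstract, a per-trajectory, time-invariant modulator entering the ODE vector field alongside the dynamic state, can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the harmonic-oscillator vector field, and the fixed-step Euler integrator are illustrative assumptions.

```python
import numpy as np

def dynamics(z, m):
    """Vector field f(z, m): a harmonic oscillator whose angular
    frequency is set by the time-invariant modulator m."""
    return np.array([m * z[1], -m * z[0]])

def integrate(z0, m, dt=0.01, steps=1000):
    """Fixed-step Euler rollout; m stays constant over the whole
    trajectory, while the state z evolves."""
    z = z0.copy()
    traj = [z.copy()]
    for _ in range(steps):
        z = z + dt * dynamics(z, m)
        traj.append(z.copy())
    return np.stack(traj)

# Two trajectories share the same initial state but carry different
# modulators, so they oscillate at different frequencies: variation
# across trajectories is captured by m, not by the initial state.
z0 = np.array([1.0, 0.0])
slow = integrate(z0, m=1.0)
fast = integrate(z0, m=3.0)
```

In the paper's setting the modulators are learned from data rather than given; the sketch only shows why separating a static `m` from the dynamic state lets a single vector field generalize across trajectory-specific parameterizations.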
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Mamba-FSCIL: Dynamic Adaptation with Selective State Space Model for Few-Shot Class-Incremental Learning [113.89327264634984]
Few-shot class-incremental learning (FSCIL) confronts the challenge of integrating new classes into a model with minimal training samples.
Traditional methods widely adopt static adaptation relying on a fixed parameter space to learn from data that arrive sequentially.
We propose a dual selective SSM projector that dynamically adjusts the projection parameters based on intermediate features, enabling dynamic adaptation.
arXiv Detail & Related papers (2024-07-08T17:09:39Z) - Neural Context Flows for Meta-Learning of Dynamical Systems [0.7373617024876724]
We introduce Neural Context Flow (NCF), a Meta-Learning framework that includes uncertainty estimation.
NCF achieves state-of-the-art Out-of-Distribution performance on 5 out of 6 linear and non-linear benchmark problems.
Our findings highlight the potential implications of NCF for foundational models in the physical sciences.
arXiv Detail & Related papers (2024-05-03T15:02:21Z) - Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO), which directly models dynamics as trajectories instead of just next-step predictions.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - Learning Neural Constitutive Laws From Motion Observations for
Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - EqMotion: Equivariant Multi-agent Motion Prediction with Invariant
Interaction Reasoning [83.11657818251447]
We propose EqMotion, an efficient equivariant motion prediction model with invariant interaction reasoning.
We conduct experiments for the proposed model on four distinct scenarios: particle dynamics, molecule dynamics, human skeleton motion prediction and pedestrian trajectory prediction.
Our method achieves state-of-the-art prediction performance on all four tasks, improving by 24.0/30.1/8.6/9.2%, respectively.
arXiv Detail & Related papers (2023-03-20T05:23:46Z) - Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z) - Neural Modal ODEs: Integrating Physics-based Modeling with Neural ODEs
for Modeling High Dimensional Monitored Structures [9.065343126886093]
This paper proposes a framework - termed Neural Modal ODEs - to integrate physics-based modeling with deep learning.
An autoencoder learns the abstract mappings from the first few items of observational data to the initial values of latent variables.
The decoder of the proposed model adopts the eigenmodes derived from an eigen-analysis applied to the linearized portion of a physics-based model.
arXiv Detail & Related papers (2022-07-16T09:30:20Z) - Incorporating NODE with Pre-trained Neural Differential Operator for
Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO).
NDO is pre-trained on a class of symbolic functions, and it learns the mapping between the trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can closely approximate the ground-truth derivatives by properly tuning the complexity of the function library.
arXiv Detail & Related papers (2021-06-08T08:04:47Z) - Go with the Flow: Adaptive Control for Neural ODEs [10.265713480189484]
We describe a new module called neurally controlled ODE (N-CODE) designed to improve the expressivity of NODEs.
N-CODE modules are dynamic variables governed by a trainable map from initial or current activation state.
A single module is sufficient for learning a distribution on non-autonomous flows that adaptively drive neural representations.
arXiv Detail & Related papers (2020-06-16T22:21:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.