Neural Physicist: Learning Physical Dynamics from Image Sequences
- URL: http://arxiv.org/abs/2006.05044v1
- Date: Tue, 9 Jun 2020 04:36:51 GMT
- Title: Neural Physicist: Learning Physical Dynamics from Image Sequences
- Authors: Baocheng Zhu, Shijun Wang and James Zhang
- Abstract summary: We present a novel architecture named Neural Physicist (NeurPhy) to learn physical dynamics directly from image sequences using deep neural networks.
Our model can not only extract the physically meaningful state representations, but also learn the state transition dynamics enabling long-term predictions for unseen image sequences.
- Score: 0.6445605125467573
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We present a novel architecture named Neural Physicist (NeurPhy) to learn
physical dynamics directly from image sequences using deep neural networks. For
any physical system, given the global system parameters, the time evolution of
states is governed by the underlying physical laws. How to learn meaningful
system representations in an end-to-end way and how to estimate accurate
state-transition dynamics that facilitate long-term prediction have been
long-standing challenges. In this paper, by leveraging recent progress in
representation learning and state space models (SSMs), we propose NeurPhy,
which uses a variational auto-encoder (VAE) to extract the underlying
Markovian dynamic state at each time step, a neural process (NP) to extract
the global system parameters,
and a non-linear non-recurrent stochastic state space model to learn the
physical dynamic transition. We apply NeurPhy to two physical experimental
environments, i.e., a damped pendulum and planetary orbital motion, and achieve
promising results. Our model can not only extract the physically meaningful
state representations, but also learn the state transition dynamics enabling
long-term predictions for unseen image sequences. Furthermore, from the
manifold dimension of the latent state space, we can easily identify the
degrees of freedom (DoF) of the underlying physical systems.
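The paper itself includes no code; the following PyTorch-style sketch only illustrates the three components named in the abstract (a VAE state encoder, a neural-process-style encoder for the global system parameters, and a non-linear, non-recurrent stochastic transition model). All class names, dimensions, and the Gaussian parameterizations are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the NeurPhy components described in the abstract.
# Layer sizes and parameterizations are illustrative assumptions only.
import torch
import torch.nn as nn


def reparameterize(mu, logvar):
    """Sample z ~ N(mu, sigma^2) with the reparameterization trick."""
    std = torch.exp(0.5 * logvar)
    return mu + std * torch.randn_like(std)


class StateEncoder(nn.Module):
    """VAE-style encoder: maps a flattened image x_t to q(z_t | x_t)."""
    def __init__(self, x_dim=64 * 64, z_dim=4, hidden=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * z_dim))

    def forward(self, x):
        mu, logvar = self.net(x).chunk(2, dim=-1)
        return mu, logvar


class GlobalEncoder(nn.Module):
    """Neural-process-style encoder: aggregates (z_t, z_{t+1}) context pairs
    into a distribution over global system parameters beta."""
    def __init__(self, z_dim=4, beta_dim=2, hidden=128):
        super().__init__()
        self.pair_net = nn.Sequential(nn.Linear(2 * z_dim, hidden), nn.ReLU(),
                                      nn.Linear(hidden, hidden))
        self.head = nn.Linear(hidden, 2 * beta_dim)

    def forward(self, z_t, z_next):
        # z_t, z_next: (batch, T, z_dim); mean-pool over the context set.
        r = self.pair_net(torch.cat([z_t, z_next], dim=-1)).mean(dim=1)
        mu, logvar = self.head(r).chunk(2, dim=-1)
        return mu, logvar


class Transition(nn.Module):
    """Non-linear, non-recurrent stochastic transition p(z_{t+1} | z_t, beta)."""
    def __init__(self, z_dim=4, beta_dim=2, hidden=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim + beta_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * z_dim))

    def forward(self, z_t, beta):
        mu, logvar = self.net(torch.cat([z_t, beta], dim=-1)).chunk(2, dim=-1)
        return mu, logvar


# Long-term prediction: encode one observed frame, infer beta from a context
# window, then roll the transition model forward without further images.
if __name__ == "__main__":
    enc, glob, trans = StateEncoder(), GlobalEncoder(), Transition()
    x0 = torch.rand(1, 64 * 64)                       # first observed frame
    z, _ = enc(x0)                                     # use the posterior mean
    ctx = torch.rand(1, 10, 4), torch.rand(1, 10, 4)   # stand-in context states
    beta = reparameterize(*glob(*ctx))
    rollout = []
    for _ in range(20):                                # 20-step prediction
        z = reparameterize(*trans(z, beta))
        rollout.append(z)
    print(torch.stack(rollout, dim=1).shape)           # (1, 20, 4)
```

In a sketch of this kind, the degrees of freedom mentioned at the end of the abstract would correspond to the number of latent coordinates that actually vary across the learned manifold.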
Related papers
- A scalable generative model for dynamical system reconstruction from neuroimaging data [5.777167013394619]
Data-driven inference of the generative dynamics underlying a set of observed time series is of growing interest in machine learning.
Recent breakthroughs in training techniques for state space models (SSMs) specifically geared toward dynamical systems reconstruction (DSR) make it possible to recover the underlying system.
We propose a novel algorithm that solves this problem and scales exceptionally well with model dimensionality and filter length.
arXiv Detail & Related papers (2024-11-05T09:45:57Z)
- Neural Material Adaptor for Visual Grounding of Intrinsic Dynamics [48.99021224773799]
We propose the Neural Material Adaptor (NeuMA), which integrates existing physical laws with learned corrections.
We also propose Particle-GS, a particle-driven 3D Gaussian Splatting variant that bridges simulation and observed images.
arXiv Detail & Related papers (2024-10-10T17:43:36Z)
- Transport-Embedded Neural Architecture: Redefining the Landscape of physics aware neural models in fluid mechanics [0.0]
A physical problem, the Taylor-Green vortex, defined on a bi-periodic domain, is used as a benchmark to evaluate the performance of both the standard physics-informed neural network and our model.
Results show that while the standard physics-informed neural network fails to predict the solution accurately and merely returns the initial condition for the entire time span, our model successfully captures the temporal changes in the physics.
arXiv Detail & Related papers (2024-10-05T10:32:51Z)
- Latent Intuitive Physics: Learning to Transfer Hidden Physics from A 3D Video [58.043569985784806]
We introduce latent intuitive physics, a transfer learning framework for physics simulation.
It can infer hidden properties of fluids from a single 3D video and simulate the observed fluid in novel scenes.
We validate our model in three ways: (i) novel scene simulation with the learned visual-world physics, (ii) future prediction of the observed fluid dynamics, and (iii) supervised particle simulation.
arXiv Detail & Related papers (2024-06-18T16:37:44Z)
- Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories instead of just next-step prediction.
EGNO explicitly learns the temporal evolution of 3D dynamics by formulating the dynamics as a function over time and learning neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- Neural Implicit Representations for Physical Parameter Inference from a Single Video [49.766574469284485]
We propose to combine neural implicit representations for appearance modeling with neural ordinary differential equations (ODEs) for modelling physical phenomena.
Our proposed model combines several unique advantages: (i) Contrary to existing approaches that require large training datasets, we are able to identify physical parameters from only a single video.
The use of neural implicit representations enables the processing of high-resolution videos and the synthesis of photo-realistic images.
arXiv Detail & Related papers (2022-04-29T11:55:35Z)
- Physics-guided Deep Markov Models for Learning Nonlinear Dynamical Systems with Uncertainty [6.151348127802708]
We propose a physics-guided framework, termed the Physics-guided Deep Markov Model (PgDMM).
The proposed framework takes advantage of the expressive power of deep learning while retaining the driving physics of the dynamical system (see the sketch after this list).
arXiv Detail & Related papers (2021-10-16T16:35:12Z)
- ST-PCNN: Spatio-Temporal Physics-Coupled Neural Networks for Dynamics Forecasting [15.265694039283106]
We propose a physics-coupled neural network model to learn parameters governing the physics of the system.
A spatio-temporal physics-coupled neural network (ST-PCNN) model is proposed to achieve three goals.
Experiments, using simulated and field-collected ocean data, validate that ST-PCNN outperforms existing physics-informed models.
arXiv Detail & Related papers (2021-08-12T19:34:00Z)
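Several of the entries above (NeuMA, NCLaw, PgDMM) combine a known physical model with a learned component rather than learning the dynamics end-to-end. The sketch below illustrates that general physics-plus-learned-correction pattern on a damped pendulum; the Euler physics step and the residual network are illustrative assumptions and are not taken from any of the cited papers.

```python
# Generic physics-plus-learned-residual transition, in the spirit of the
# physics-guided entries above. The damped-pendulum step and the residual
# network are illustrative assumptions, not code from the cited papers.
import torch
import torch.nn as nn


def pendulum_step(state, dt=0.05, damping=0.1, g_over_l=9.81):
    """Known (approximate) physics: explicit Euler step for a damped pendulum.
    state[..., 0] is the angle theta, state[..., 1] the angular velocity."""
    theta, omega = state[..., 0], state[..., 1]
    domega = -g_over_l * torch.sin(theta) - damping * omega
    return torch.stack([theta + dt * omega, omega + dt * domega], dim=-1)


class ResidualDynamics(nn.Module):
    """Learned correction applied on top of the known physics step."""
    def __init__(self, state_dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, state_dim))

    def forward(self, state):
        # Next state = physics prediction + learned residual correction.
        return pendulum_step(state) + self.net(state)


if __name__ == "__main__":
    model = ResidualDynamics()
    s = torch.tensor([[0.5, 0.0]])   # initial angle 0.5 rad, at rest
    for _ in range(100):             # roll out 100 corrected steps
        s = model(s)
    print(s)
```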
This list is automatically generated from the titles and abstracts of the papers on this site.