Physics-informed Reinforcement Learning for Perception and Reasoning about Fluids
- URL: http://arxiv.org/abs/2203.05775v1
- Date: Fri, 11 Mar 2022 07:01:23 GMT
- Title: Physics-informed Reinforcement Learning for Perception and Reasoning about Fluids
- Authors: Beatriz Moya, Alberto Badias, David Gonzalez, Francisco Chinesta, Elias Cueto
- Abstract summary: We propose a physics-informed reinforcement learning strategy for fluid perception and reasoning from observations.
We develop a method for the tracking (perception) and analysis (reasoning) of any previously unseen liquid whose free surface is observed with a commodity camera.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning and reasoning about physical phenomena is still a challenge in
robotics development, and the computational sciences play a central role in the
search for accurate methods able to provide explanations for past events and
rigorous forecasts of future situations. We propose a physics-informed
reinforcement learning strategy for fluid perception and reasoning from
observations. As a model problem, we take the sloshing phenomenon of different
fluids contained in a glass. Starting from full-field, high-resolution
synthetic data for a particular fluid, we develop a method for the tracking
(perception) and analysis (reasoning) of any previously unseen liquid whose
free surface is observed with a commodity camera. This approach demonstrates
the importance of physical knowledge not only in data-driven (grey-box)
modeling, but also in correcting the model so that it adapts to the real
physics under low-data regimes and partial observations of the dynamics. The
method presented here extends to other domains, such as the development of
cognitive digital twins able to learn from observing phenomena for which they
have not been explicitly trained.
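The abstract describes the overall loop (an agent correcting a pre-trained reduced-order sloshing model so that its predicted free surface matches camera observations) but no code. The following is a minimal, hypothetical sketch of that kind of loop only; the reduced-order model, the perception stand-in, the discrete correction actions, and the bandit-style value update are all illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: an RL agent picks small corrections to a reduced-order
# sloshing model so that its predicted free surface tracks an observed one.
import numpy as np

class SloshingROM:
    """Toy reduced-order model: free-surface modal amplitudes advanced in time."""
    def __init__(self, n_modes=8, viscosity=1.0):
        self.z = np.zeros(n_modes)            # reduced state (modal amplitudes)
        self.viscosity = viscosity

    def step(self, correction):
        # Nominal (pre-trained) damped dynamics plus an RL-chosen correction term.
        damping = np.exp(-0.05 * self.viscosity)
        self.z = damping * self.z + correction
        return self.z

def reward(predicted, observed):
    # Negative L2 discrepancy between predicted and observed free surfaces.
    return -float(np.linalg.norm(predicted - observed))

# Small discrete set of candidate corrections, epsilon-greedy selection, and a
# bandit-style value update per camera frame (sketch only).
actions = [np.full(8, a) for a in (-0.01, 0.0, 0.01)]
q_values = np.zeros(len(actions))
rom, alpha, rng = SloshingROM(), 0.1, np.random.default_rng(0)

for t in range(100):
    # Synthetic stand-in for a free surface extracted from a camera frame.
    observed = 0.02 * np.sin(0.3 * t + np.arange(8))
    a_idx = rng.integers(len(actions)) if rng.random() < 0.1 else int(np.argmax(q_values))
    predicted = rom.step(actions[a_idx])
    q_values[a_idx] += alpha * (reward(predicted, observed) - q_values[a_idx])
```

In this toy version the agent simply learns which constant correction keeps the model closest to the observations; the actual paper works with richer state corrections and physical constraints.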
Related papers
- PIETRA: Physics-Informed Evidential Learning for Traversing Out-of-Distribution Terrain [35.21102019590834]
Physics-Informed Evidential Traversability (PIETRA) is a self-supervised learning framework that integrates physics priors directly into the mathematical formulation of evidential neural networks.
Our evidential network seamlessly transitions between learned and physics-based predictions for out-of-distribution inputs.
PIETRA improves both learning accuracy and navigation performance in environments with significant distribution shifts.
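The summary suggests an uncertainty-gated hand-off between a learned predictor and a physics-based prior. A minimal sketch of that idea follows; the evidential head, the slope-based physics prior, and the blending rule are assumptions for illustration, not PIETRA's actual formulation.

```python
# Hypothetical sketch of uncertainty-gated blending between a learned
# traversability cost and a physics-based prior cost.
import torch
import torch.nn as nn

class EvidentialTraversability(nn.Module):
    def __init__(self, in_dim=16):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU())
        self.mean_head = nn.Linear(64, 1)       # learned traversability cost
        self.evidence_head = nn.Linear(64, 1)   # evidence -> epistemic uncertainty

    def physics_prior(self, x):
        # Stand-in analytical cost, e.g. proportional to terrain slope (x[..., 0]).
        return torch.abs(x[..., :1]) * 10.0

    def forward(self, x):
        h = self.backbone(x)
        learned = self.mean_head(h)
        evidence = nn.functional.softplus(self.evidence_head(h))
        u = 1.0 / (1.0 + evidence)              # high u = likely out-of-distribution
        # Fall back to the physics prior as epistemic uncertainty grows.
        return (1.0 - u) * learned + u * self.physics_prior(x)

cost = EvidentialTraversability()(torch.randn(4, 16))   # per-sample blended cost
```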
arXiv Detail & Related papers (2024-09-04T18:01:10Z)
- Machine Learning with Physics Knowledge for Prediction: A Survey [16.96920919164813]
This survey examines the broad suite of methods and models for combining machine learning with physics knowledge for prediction and forecast.
The survey has two parts. The first considers incorporating physics knowledge on an architectural level through objective functions, structured predictive models, and data augmentation.
The second considers data as physics knowledge, which motivates looking at multi-task, meta, and contextual learning as an alternative approach to incorporating physics knowledge in a data-driven fashion.
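As a concrete instance of the first category (physics entering through the objective function), a standard physics-informed loss adds a differential-equation residual to the data-fit term. The sketch below uses a toy ODE, du/dt = -u, and an arbitrary residual weight; it illustrates the pattern rather than any specific method from the survey.

```python
# Physics knowledge via the objective: total loss = data term + weighted
# residual of a governing equation (here the toy ODE du/dt = -u).
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
t_data, u_data = torch.tensor([[0.0]]), torch.tensor([[1.0]])        # one observation
t_col = torch.linspace(0, 2, 32).reshape(-1, 1).requires_grad_(True)  # collocation points

u_col = net(t_col)
du_dt = torch.autograd.grad(u_col.sum(), t_col, create_graph=True)[0]
residual = du_dt + u_col                                              # enforce du/dt = -u
loss = nn.functional.mse_loss(net(t_data), u_data) + 0.1 * (residual ** 2).mean()
loss.backward()   # gradients now reflect both the data and the physics residual
```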
arXiv Detail & Related papers (2024-08-19T09:36:07Z)
- Latent Intuitive Physics: Learning to Transfer Hidden Physics from A 3D Video [58.043569985784806]
We introduce latent intuitive physics, a transfer learning framework for physics simulation.
It can infer hidden properties of fluids from a single 3D video and simulate the observed fluid in novel scenes.
We validate our model in three ways: (i) novel scene simulation with the learned visual-world physics, (ii) future prediction of the observed fluid dynamics, and (iii) supervised particle simulation.
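A highly simplified sketch of the transfer pattern described here is to encode observed frames into a latent fluid-property vector and condition a learned simulator on it in a new scene. The toy encoder, simulator, and tensor sizes below are placeholders, not the paper's model.

```python
# Placeholder sketch: infer a latent property vector from one observed video,
# then reuse it to condition a learned simulator in a novel scene.
import torch
import torch.nn as nn

class PropertyEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 16 * 16, 64),
                                 nn.ReLU(), nn.Linear(64, 8))
    def forward(self, frames):                # frames: (T, 3, 16, 16)
        return self.net(frames).mean(dim=0)   # aggregate over time -> (8,)

class ConditionedSimulator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(32 + 8, 64), nn.ReLU(), nn.Linear(64, 32))
    def forward(self, state, props):
        return self.net(torch.cat([state, props], dim=-1))

props = PropertyEncoder()(torch.randn(10, 3, 16, 16))         # inferred from one video
next_state = ConditionedSimulator()(torch.randn(32), props)   # applied in a new scene
```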
arXiv Detail & Related papers (2024-06-18T16:37:44Z)
- Physics-Encoded Graph Neural Networks for Deformation Prediction under Contact [87.69278096528156]
In robotics, it's crucial to understand object deformation during tactile interactions.
We introduce a method using Physics-Encoded Graph Neural Networks (GNNs) for such predictions.
We've made our code and dataset public to advance research in robotic simulation and grasping.
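The summary names the model family but not its details. A bare-bones message-passing step on a contact mesh graph, with node features such as position and applied force, might look like the generic sketch below; the paper's physics-encoded architecture is more involved.

```python
# Generic single message-passing layer for per-node deformation prediction on
# a mesh graph. Node features and edge list below are illustrative only.
import torch
import torch.nn as nn

class MeshGNNLayer(nn.Module):
    def __init__(self, dim=6):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, dim))
        self.upd = nn.Sequential(nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, 3))

    def forward(self, x, edges):
        # x: (N, 6) node features = [position (3), applied force (3)]
        # edges: (E, 2) index pairs (sender, receiver)
        src, dst = edges[:, 0], edges[:, 1]
        messages = self.msg(torch.cat([x[src], x[dst]], dim=-1))
        agg = torch.zeros_like(x).index_add_(0, dst, messages)   # sum incoming messages
        return self.upd(torch.cat([x, agg], dim=-1))             # per-node displacement

x = torch.randn(5, 6)                                  # 5 mesh nodes
edges = torch.tensor([[0, 1], [1, 2], [2, 3], [3, 4]])
displacement = MeshGNNLayer()(x, edges)                # predicted deformation per node
```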
arXiv Detail & Related papers (2024-02-05T19:21:52Z)
- X-VoE: Measuring eXplanatory Violation of Expectation in Physical Events [75.94926117990435]
This study introduces X-VoE, a benchmark dataset to assess AI agents' grasp of intuitive physics.
X-VoE establishes a higher bar for the explanatory capacities of intuitive physics models.
We present an explanation-based learning system that captures physics dynamics and infers occluded object states.
arXiv Detail & Related papers (2023-08-21T03:28:23Z)
- NeuroFluid: Fluid Dynamics Grounding with Particle-Driven Neural Radiance Fields [65.07940731309856]
Deep learning has shown great potential for modeling the physical dynamics of complex particle systems such as fluids.
In this paper, we consider a partially observable scenario known as fluid dynamics grounding.
We propose a differentiable two-stage network named NeuroFluid.
It is shown to reasonably estimate the underlying physics of fluids with different initial shapes, viscosities, and densities.
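A skeletal version of the two-stage idea above chains a particle transition model to a differentiable particle-driven renderer and optimizes both end to end against the observed images. The toy modules below are stand-ins for illustration, not NeuroFluid itself.

```python
# Toy two-stage sketch: particle transition model -> particle-driven renderer,
# trained end to end against an observed image. Both modules are placeholders.
import torch
import torch.nn as nn

class ParticleTransition(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(6, 64), nn.ReLU(), nn.Linear(64, 6))
    def forward(self, particles):             # (N, 6) = position + velocity
        return particles + self.net(particles)

class ParticleRenderer(nn.Module):
    def __init__(self, hw=16):
        super().__init__()
        self.net, self.hw = nn.Linear(6, hw * hw), hw
    def forward(self, particles):              # crude "splat" of particles to an image
        return torch.sigmoid(self.net(particles).mean(dim=0).reshape(self.hw, self.hw))

transition, renderer = ParticleTransition(), ParticleRenderer()
opt = torch.optim.Adam(list(transition.parameters()) + list(renderer.parameters()), lr=1e-3)

particles = torch.randn(100, 6)
observed = torch.rand(16, 16)                   # stand-in for an observed frame
loss = nn.functional.mse_loss(renderer(transition(particles)), observed)
loss.backward()                                 # gradients flow through both stages
opt.step()
```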
arXiv Detail & Related papers (2022-03-03T15:13:29Z)
- Physics perception in sloshing scenes with guaranteed thermodynamic consistency [0.0]
We propose a strategy to learn the full state of sloshing liquids from measurements of the free surface.
Our approach is based on recurrent neural networks (RNN) that project the limited information available to a reduced-order manifold.
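A compact sketch of this idea is a recurrent network that consumes a sequence of free-surface measurements and projects it to a low-dimensional latent state, from which a full flow state is reconstructed. The dimensions and the simple linear decoder below are assumptions, not the paper's reduced-order construction.

```python
# Sketch: GRU maps a free-surface measurement sequence to a reduced-order
# latent state; a decoder lifts that state back to a full field.
import torch
import torch.nn as nn

class SurfaceToStateRNN(nn.Module):
    def __init__(self, surface_dim=32, latent_dim=8, full_state_dim=256):
        super().__init__()
        self.rnn = nn.GRU(surface_dim, latent_dim, batch_first=True)
        self.decoder = nn.Linear(latent_dim, full_state_dim)

    def forward(self, surface_seq):            # (batch, time, surface_dim)
        _, h = self.rnn(surface_seq)           # final hidden state: (1, batch, latent_dim)
        return self.decoder(h.squeeze(0))      # reconstructed full state

full_state = SurfaceToStateRNN()(torch.randn(2, 50, 32))   # -> (2, 256)
```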
arXiv Detail & Related papers (2021-06-24T20:13:56Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
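One way to read "a part of the latent space is grounded by physics" is a decoder in which a few latent coordinates drive a simple physical model while the remaining coordinates drive a neural correction. The damped-oscillation physics model and the split of latent dimensions below are illustrative assumptions, not the paper's architecture.

```python
# Hedged sketch of a decoder whose first latent coordinates parameterize a
# physics model (damped oscillation) and the rest a learned residual.
import torch
import torch.nn as nn

class PhysicsGroundedDecoder(nn.Module):
    def __init__(self, z_phys=2, z_aux=4, out_dim=64):
        super().__init__()
        self.t = torch.linspace(0, 1, out_dim)
        self.nn_part = nn.Sequential(nn.Linear(z_aux, 32), nn.ReLU(), nn.Linear(32, out_dim))
        self.z_phys = z_phys

    def forward(self, z):
        omega, decay = z[:, 0:1], nn.functional.softplus(z[:, 1:2])
        physics = torch.exp(-decay * self.t) * torch.sin(omega * self.t)  # interpretable part
        return physics + self.nn_part(z[:, self.z_phys:])                 # + learned residual

x_recon = PhysicsGroundedDecoder()(torch.randn(3, 6))   # -> (3, 64)
```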
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
- Visual Grounding of Learned Physical Models [66.04898704928517]
Humans intuitively recognize objects' physical properties and predict their motion, even when the objects are engaged in complicated interactions.
We present a neural model that simultaneously reasons about physics and makes future predictions based on visual and dynamics priors.
Experiments show that our model can infer the physical properties within a few observations, which allows the model to quickly adapt to unseen scenarios and make accurate predictions into the future.
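The quick-adaptation claim above amounts to fitting a small set of physical parameters against a few observed states. A toy version, estimating a drag coefficient by gradient descent through a differentiable simulation step, is sketched below; the drag-only dynamics and the synthetic "observations" are stand-ins, not the paper's model.

```python
# Toy sketch: infer a physical property (drag) from a short observed
# trajectory by differentiating through a simple dynamics rollout.
import torch

def simulate(v0, drag, steps=10, dt=0.1):
    v, traj = v0, []
    for _ in range(steps):
        v = v - drag * v * dt          # velocity decays under drag
        traj.append(v)
    return torch.stack(traj)

observed = simulate(torch.tensor(1.0), torch.tensor(0.7))   # synthetic ground truth, drag = 0.7
drag = torch.tensor(0.1, requires_grad=True)
opt = torch.optim.Adam([drag], lr=0.05)

for _ in range(200):
    opt.zero_grad()
    loss = ((simulate(torch.tensor(1.0), drag) - observed) ** 2).mean()
    loss.backward()
    opt.step()
# After fitting, `drag` approaches 0.7 and can be reused for future prediction.
```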
arXiv Detail & Related papers (2020-04-28T17:06:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.