Discover physical concepts and equations with machine learning
- URL: http://arxiv.org/abs/2412.12161v2
- Date: Tue, 22 Apr 2025 22:11:42 GMT
- Title: Discover physical concepts and equations with machine learning
- Authors: Bao-Bing Li, Yi Gu, Shao-Feng Wu
- Abstract summary: We propose a model that combines Variational Autoencoders (VAE) with Neural Ordinary Differential Equations (Neural ODEs). This allows us to simultaneously discover physical concepts and governing equations from simulated experimental data. We apply the model to several examples inspired by the history of physics, including Copernicus' heliocentrism, Newton's law of gravity, Schrödinger's wave mechanics, and Pauli's spin-magnetic formulation.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning can uncover physical concepts or physical equations when prior knowledge of the other is available. However, these two aspects are often intertwined and cannot be discovered independently. We extend SciNet, a neural network architecture that simulates the human physical reasoning process for physics discovery, by proposing a model that combines Variational Autoencoders (VAE) with Neural Ordinary Differential Equations (Neural ODEs). This allows us to simultaneously discover physical concepts and governing equations from simulated experimental data across various physical systems. We apply the model to several examples inspired by the history of physics, including Copernicus' heliocentrism, Newton's law of gravity, Schrödinger's wave mechanics, and Pauli's spin-magnetic formulation. The results demonstrate that the correct physical theories can emerge in the neural network.
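The division of labor described in the abstract can be caricatured in a few lines. The sketch below is not the authors' implementation: a stand-in linear "encoder" plays the role of the VAE (variances omitted), a fixed linear vector field plays the role of the learned Neural ODE, and forward Euler stands in for a proper ODE solver. All names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper's actual sizes differ).
obs_dim, latent_dim = 8, 2

# "Encoder": a random linear map standing in for the VAE encoder,
# which compresses observations into latent physical concepts.
W_enc = rng.normal(size=(latent_dim, obs_dim))

def encode(x):
    return W_enc @ x  # mean of q(z|x); the variance branch is omitted here

# Latent vector field f(z) standing in for the Neural ODE that plays
# the role of the governing equation dz/dt = f(z).
A = np.array([[0.0, 1.0], [-1.0, 0.0]])  # harmonic-oscillator-like dynamics

def f(z):
    return A @ z

def integrate(z0, t1, steps=100):
    """Forward Euler integration of dz/dt = f(z) from t=0 to t=t1."""
    z, dt = z0.copy(), t1 / steps
    for _ in range(steps):
        z = z + dt * f(z)
    return z

x0 = rng.normal(size=obs_dim)
z0 = encode(x0)              # observation -> latent "concept"
z1 = integrate(z0, t1=1.0)   # latent state evolved under the "equation"
```

In the actual model both the encoder and the vector field are trained jointly, so that the latent coordinates and the dynamics acting on them are discovered together rather than one being fixed in advance.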
Related papers
- Cosmos-Reason1: From Physical Common Sense To Embodied Reasoning [76.94237859217469]
Physical AI systems need to perceive, understand, and perform complex actions in the physical world.
We present models that can understand the physical world and generate appropriate embodied decisions.
We use a hierarchical ontology that captures fundamental knowledge about space, time, and physics.
For embodied reasoning, we rely on a two-dimensional ontology that generalizes across different physical embodiments.
arXiv Detail & Related papers (2025-03-18T22:06:58Z) - Latent Intuitive Physics: Learning to Transfer Hidden Physics from A 3D Video [58.043569985784806]
We introduce latent intuitive physics, a transfer learning framework for physics simulation.
It can infer hidden properties of fluids from a single 3D video and simulate the observed fluid in novel scenes.
We validate our model in three ways: (i) novel scene simulation with the learned visual-world physics, (ii) future prediction of the observed fluid dynamics, and (iii) supervised particle simulation.
arXiv Detail & Related papers (2024-06-18T16:37:44Z) - PhyRecon: Physically Plausible Neural Scene Reconstruction [81.73129450090684]
We introduce PHYRECON, the first approach to leverage both differentiable rendering and differentiable physics simulation to learn implicit surface representations.
Central to this design is an efficient transformation between SDF-based implicit representations and explicit surface points.
Our results also exhibit superior physical stability in physical simulators, with at least a 40% improvement across all datasets.
arXiv Detail & Related papers (2024-04-25T15:06:58Z) - Physics-Encoded Graph Neural Networks for Deformation Prediction under Contact [87.69278096528156]
In robotics, it's crucial to understand object deformation during tactile interactions.
We introduce a method using Physics-Encoded Graph Neural Networks (GNNs) for such predictions.
We've made our code and dataset public to advance research in robotic simulation and grasping.
arXiv Detail & Related papers (2024-02-05T19:21:52Z) - Quantum-Inspired Neural Network Model of Optical Illusions [0.0]
We train a deep neural network model to simulate human perception of the Necker cube.
Our results will find applications in video games and virtual reality systems employed for training of astronauts and operators of unmanned aerial vehicles.
arXiv Detail & Related papers (2023-12-06T12:10:56Z) - Fourier Neural Differential Equations for learning Quantum Field Theories [57.11316818360655]
A Quantum Field Theory is defined by its interaction Hamiltonian, and linked to experimental data by the scattering matrix.
In this paper, NDE models are used to learn several quantum field theories, including Scalar-Yukawa theory and Scalar Quantum Electrodynamics.
The interaction Hamiltonian of a theory can be extracted from network parameters.
arXiv Detail & Related papers (2023-11-28T22:11:15Z) - Towards Cross Domain Generalization of Hamiltonian Representation via Meta Learning [2.3020018305241337]
In this work, we take a significant leap forward by targeting cross-domain generalization within the field of Hamiltonian dynamics.
We model our system with a graph neural network (GNN) and employ a meta-learning algorithm to enable the model to gain experience over a distribution of systems.
We demonstrate that the meta-trained model captures the generalized Hamiltonian representation that is consistent across different physical domains.
arXiv Detail & Related papers (2022-12-02T13:47:21Z) - Physically Consistent Neural ODEs for Learning Multi-Physics Systems [0.0]
In this paper, we leverage the framework of Irreversible port-Hamiltonian Systems (IPHS), which can describe most multi-physics systems.
We propose Physically Consistent NODEs (PC-NODEs) to learn parameters from data.
We demonstrate the effectiveness of the proposed method by learning the thermodynamics of a building from real-world measurements.
arXiv Detail & Related papers (2022-11-11T11:20:35Z) - Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z) - Applications of physics informed neural operators [2.588973722689844]
We present an end-to-end framework to learn partial differential equations.
We first demonstrate that our methods reproduce the accuracy and performance of other neural operators.
We apply our physics-informed neural operators to learn new types of equations, including the 2D Burgers equation.
arXiv Detail & Related papers (2022-03-23T18:00:05Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial information and recurrent neural networks to model the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - The Physics of Machine Learning: An Intuitive Introduction for the Physical Scientist [0.0]
This article is intended for physical scientists who wish to gain deeper insights into machine learning algorithms.
We begin with a review of two energy-based machine learning algorithms, Hopfield networks and Boltzmann machines, and their connection to the Ising model.
We then delve into additional, more "practical," machine learning architectures including feedforward neural networks, convolutional neural networks, and autoencoders.
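The energy-based models this introduction reviews are easy to demonstrate concretely. Below is a minimal Hopfield-network sketch (a standard textbook construction, not code from the article): one ±1 pattern is stored with the Hebbian rule, and the network recovers it from a corrupted copy by iterating the update rule s_i ← sign(Σ_j W_ij s_j), which never increases the Ising-like energy E = -½ Σ_ij W_ij s_i s_j.

```python
import numpy as np

# Store one +/-1 pattern with the Hebbian rule W = p p^T (zero diagonal).
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)  # no self-connections

# Corrupt the stored memory by flipping two bits.
state = pattern.copy()
state[:2] *= -1

# Iterate the synchronous update rule; each sweep is a descent step
# on the Ising-like energy, so the state settles into the stored pattern.
for _ in range(5):
    state = np.where(W @ state >= 0, 1, -1)
```

With a single stored pattern the recovery succeeds in one sweep; the connection to the Ising model is that W plays the role of the coupling matrix and the neurons play the role of spins.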
arXiv Detail & Related papers (2021-11-27T15:12:42Z) - Dynamic Visual Reasoning by Learning Differentiable Physics Models from
Video and Language [92.7638697243969]
We propose a unified framework that can jointly learn visual concepts and infer physics models of objects from videos and language.
This is achieved by seamlessly integrating three components: a visual perception module, a concept learner, and a differentiable physics engine.
arXiv Detail & Related papers (2021-10-28T17:59:13Z) - The Autodidactic Universe [0.8795040582681388]
We present an approach to cosmology in which the Universe learns its own physical laws.
We discover maps that put each of these matrix models in correspondence with both a gauge/gravity theory and a mathematical model of a learning machine.
We discuss in detail what it means to say that learning takes place in autodidactic systems, where there is no supervision.
arXiv Detail & Related papers (2021-03-29T02:25:02Z) - Partial Differential Equations is All You Need for Generating Neural Architectures -- A Theory for Physical Artificial Intelligence Systems [40.20472268839781]
We generalize the reaction-diffusion equation in statistical physics, the Schrödinger equation in quantum mechanics, and the Helmholtz equation in paraxial optics into a neural partial differential equation (NPDE).
We use the finite difference method to discretize the NPDE and find numerical solutions.
Basic building blocks of deep neural network architecture, including multi-layer perceptron, convolutional neural network and recurrent neural networks, are generated.
arXiv Detail & Related papers (2021-03-10T00:05:46Z)
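The finite-difference route from a PDE to neural-network building blocks can be illustrated on a toy case. The sketch below is not the paper's NPDE; it only shows that one explicit finite-difference step of the 1D heat equation u_t = u_xx is exactly a convolution with a fixed stencil, which is the sense in which discretized PDEs resemble convolutional layers.

```python
import numpy as np

def heat_step(u, r=0.25):
    """One explicit finite-difference step of u_t = u_xx.

    r = dt/dx^2 (stable for r <= 0.5). The update
    u_i <- r*u_{i-1} + (1-2r)*u_i + r*u_{i+1}
    is a convolution with the stencil [r, 1-2r, r], i.e. a fixed
    convolutional layer; learning the stencil would give a trainable one.
    """
    kernel = np.array([r, 1.0 - 2.0 * r, r])
    return np.convolve(u, kernel, mode="same")  # zero-padded boundaries

u = np.zeros(11)
u[5] = 1.0  # initial spike of heat at the center
for _ in range(10):
    u = heat_step(u)  # heat diffuses outward symmetrically
```

Stacking many such steps is a deep network with tied weights, which is the intuition behind generating architectures from discretized PDEs.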
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.