Discover Physical Concepts and Equations with Machine Learning
- URL: http://arxiv.org/abs/2412.12161v1
- Date: Wed, 11 Dec 2024 15:30:21 GMT
- Title: Discover Physical Concepts and Equations with Machine Learning
- Authors: Bao-Bing Li, Yi Gu, Shao-Feng Wu
- Abstract summary: We extend SciNet, a neural network architecture that simulates the human physical reasoning process for physics discovery.
We apply the model to several key examples inspired by the history of physics, including Copernicus' heliocentric solar system.
Results demonstrate that the neural network successfully reconstructs the corresponding theories.
- Score: 7.565272546753481
- Abstract: Machine learning can uncover physical concepts or physical equations when prior knowledge of the other is available. In many cases, however, the two are coupled and cannot be discovered independently. We extend SciNet, a neural network architecture that simulates the human physical reasoning process for physics discovery, by proposing a model that combines Variational Autoencoders (VAEs) with Neural Ordinary Differential Equations (Neural ODEs). This allows us to simultaneously discover physical concepts and governing equations from simulated experimental data across diverse physical systems. We apply the model to several key examples inspired by the history of physics, including Copernicus' heliocentric solar system, Newton's law of universal gravitation, the wave function together with the Schrödinger equation, and spin-1/2 along with the Pauli equation. The results demonstrate that the neural network successfully reconstructs the corresponding theories.
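The VAE-plus-Neural-ODE pipeline described in the abstract can be illustrated with a minimal forward-pass sketch: an encoder compresses an observation into latent "concept" variables, a learned vector field evolves them in time, and a decoder maps the latent trajectory back to observables. All weights below are random stand-ins, not the paper's trained model, and the dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: observation -> latent "concepts" -> trajectory.
obs_dim, latent_dim = 8, 2

# Randomly initialised weights stand in for trained parameters.
W_enc = rng.normal(size=(obs_dim, 2 * latent_dim)) * 0.1   # encoder: mean and log-variance
W_ode = rng.normal(size=(latent_dim, latent_dim)) * 0.1    # latent dynamics f(z)
W_dec = rng.normal(size=(latent_dim, obs_dim)) * 0.1       # decoder back to observables

def encode(x):
    """VAE encoder: map an observation to a sampled latent vector."""
    h = x @ W_enc
    mu, log_var = h[:latent_dim], h[latent_dim:]
    # Reparameterisation trick: sample z = mu + sigma * eps.
    return mu + np.exp(0.5 * log_var) * rng.normal(size=latent_dim)

def latent_ode(z, t_grid):
    """Neural ODE: integrate dz/dt = tanh(z W_ode) with explicit Euler steps."""
    traj = [z]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        z = z + (t1 - t0) * np.tanh(z @ W_ode)
        traj.append(z)
    return np.stack(traj)

def decode(traj):
    """Decoder: map the latent trajectory back to predicted observations."""
    return traj @ W_dec

x0 = rng.normal(size=obs_dim)          # one simulated measurement
t_grid = np.linspace(0.0, 1.0, 20)
z0 = encode(x0)
latent_traj = latent_ode(z0, t_grid)   # latent concepts evolved in time
recon = decode(latent_traj)            # predicted observations along the trajectory
print(recon.shape)                     # one decoded observation per time point
```

In the actual model the encoder, vector field, and decoder are trained jointly, so the latent variables are pressured to become interpretable physical concepts while the ODE captures the governing equation.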
Related papers
- Fourier Neural Differential Equations for learning Quantum Field
Theories [57.11316818360655]
A Quantum Field Theory is defined by its interaction Hamiltonian, and linked to experimental data by the scattering matrix.
In this paper, NDE models are used to learn theories including Scalar-Yukawa theory and Scalar Quantum Electrodynamics.
The interaction Hamiltonian of a theory can be extracted from network parameters.
arXiv Detail & Related papers (2023-11-28T22:11:15Z) - Physics-informed machine learning of the correlation functions in bulk
fluids [2.1255150235172837]
The Ornstein-Zernike (OZ) equation is the fundamental equation for pair correlation function computations in the modern integral equation theory for liquids.
In this work, machine learning models, notably physics-informed neural networks and physics-informed neural operator networks, are explored to solve the OZ equation.
arXiv Detail & Related papers (2023-09-02T00:11:48Z) - Learning the solution operator of two-dimensional incompressible
Navier-Stokes equations using physics-aware convolutional neural networks [68.8204255655161]
We introduce a technique with which it is possible to learn approximate solutions to the steady-state Navier--Stokes equations in varying geometries without the need of parametrization.
The results of our physics-aware CNN are compared to a state-of-the-art data-based approach.
arXiv Detail & Related papers (2023-08-04T05:09:06Z) - Physically Consistent Neural ODEs for Learning Multi-Physics Systems [0.0]
In this paper, we leverage the framework of Irreversible port-Hamiltonian Systems (IPHS), which can describe most multi-physics systems.
We propose Physically Consistent NODEs (PC-NODEs) to learn parameters from data.
We demonstrate the effectiveness of the proposed method by learning the thermodynamics of a building from the real-world measurements.
arXiv Detail & Related papers (2022-11-11T11:20:35Z) - Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z) - Applications of physics informed neural operators [2.588973722689844]
We present an end-to-end framework to learn partial differential equations.
We first demonstrate that our methods reproduce the accuracy and performance of other neural operators.
We apply our physics-informed neural operators to learn new types of equations, including the 2D Burgers equation.
arXiv Detail & Related papers (2022-03-23T18:00:05Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial information and recurrent neural networks to model the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Dynamic Visual Reasoning by Learning Differentiable Physics Models from
Video and Language [92.7638697243969]
We propose a unified framework that can jointly learn visual concepts and infer physics models of objects from videos and language.
This is achieved by seamlessly integrating three components: a visual perception module, a concept learner, and a differentiable physics engine.
arXiv Detail & Related papers (2021-10-28T17:59:13Z) - The Autodidactic Universe [0.8795040582681388]
We present an approach to cosmology in which the Universe learns its own physical laws.
We discover maps that put each of these matrix models in correspondence with both a gauge/gravity theory and a mathematical model of a learning machine.
We discuss in detail what it means to say that learning takes place in autodidactic systems, where there is no supervision.
arXiv Detail & Related papers (2021-03-29T02:25:02Z) - Partial Differential Equations is All You Need for Generating Neural Architectures -- A Theory for Physical Artificial Intelligence Systems [40.20472268839781]
We generalize the reaction-diffusion equation in statistical physics, the Schrödinger equation in quantum mechanics, and the Helmholtz equation in paraxial optics into a neural partial differential equation (NPDE).
We use the finite difference method to discretize the NPDE and find numerical solutions.
Basic building blocks of deep neural network architectures, including multi-layer perceptrons, convolutional neural networks, and recurrent neural networks, are generated.
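The finite-difference discretization this entry mentions can be sketched generically. The example below is not the paper's NPDE but a standard explicit (FTCS) scheme for the 1D diffusion equation u_t = D u_xx, which illustrates the same discretize-then-step approach; all parameter values are illustrative.

```python
import numpy as np

# 1D diffusion u_t = D u_xx on [0, 1] with explicit (FTCS) finite differences.
D, nx, nt = 0.1, 51, 200
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2 / D          # respects the stability bound D*dt/dx^2 <= 1/2

x = np.linspace(0.0, 1.0, nx)
u = np.exp(-100.0 * (x - 0.5) ** 2)   # initial Gaussian bump, peak value 1
u[0] = u[-1] = 0.0                    # Dirichlet boundary conditions

for _ in range(nt):
    # Central second difference in space, forward Euler in time.
    u[1:-1] += D * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

print(u.max())   # the peak diffuses and flattens below its initial height
```

Replacing the diffusion stencil with a learned right-hand side is, in spirit, how an NPDE generates network layers from a discretized PDE.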
arXiv Detail & Related papers (2021-03-10T00:05:46Z) - Learning to Simulate Complex Physics with Graph Networks [68.43901833812448]
We present a machine learning framework and model implementation that can learn to simulate a wide variety of challenging physical domains.
Our framework, which we term "Graph Network-based Simulators" (GNS), represents the state of a physical system with particles, expressed as nodes in a graph, and computes dynamics via learned message-passing.
Our results show that our model can generalize from single-timestep predictions with thousands of particles during training, to different initial conditions, thousands of timesteps, and at least an order of magnitude more particles at test time.
arXiv Detail & Related papers (2020-02-21T16:44:28Z)
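The particles-as-nodes, learned message-passing idea behind GNS can be sketched in a few lines: build a neighbourhood graph over particles, compute messages on edges, aggregate them at receiving nodes, and update node states. The weights below are random stand-ins for trained edge/node networks, and the cutoff radius and feature sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy particle system: each node carries (position, velocity) in 1D.
n_nodes, feat, hidden = 5, 2, 8
pos = rng.uniform(size=n_nodes)
vel = rng.normal(size=n_nodes) * 0.01
nodes = np.stack([pos, vel], axis=1)                  # (n_nodes, feat)

# Connect particles within a cutoff radius (GNS-style neighbourhood graph).
cutoff = 0.3
send, recv = np.nonzero(
    (np.abs(pos[:, None] - pos[None, :]) < cutoff) & ~np.eye(n_nodes, dtype=bool)
)

# Randomly initialised weights stand in for trained edge and node networks.
W_edge = rng.normal(size=(2 * feat, hidden)) * 0.1
W_node = rng.normal(size=(feat + hidden, feat)) * 0.1

def message_passing_step(nodes):
    """One learned message-passing step: edge messages, aggregation, node update."""
    edge_in = np.concatenate([nodes[send], nodes[recv]], axis=1)  # (E, 2*feat)
    messages = np.tanh(edge_in @ W_edge)                          # (E, hidden)
    agg = np.zeros((n_nodes, hidden))
    np.add.at(agg, recv, messages)          # sum incoming messages per receiver
    node_in = np.concatenate([nodes, agg], axis=1)
    return nodes + np.tanh(node_in @ W_node)  # residual node update

new_nodes = message_passing_step(nodes)
print(new_nodes.shape)   # same shape as the input node states
```

In GNS this step is stacked several times and trained so the final node updates predict per-particle accelerations, which an integrator then rolls out over thousands of timesteps.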
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.