A hybrid approach for solving the gravitational N-body problem with
Artificial Neural Networks
- URL: http://arxiv.org/abs/2310.20398v1
- Date: Tue, 31 Oct 2023 12:20:20 GMT
- Title: A hybrid approach for solving the gravitational N-body problem with
Artificial Neural Networks
- Authors: Veronica Saz Ulibarrena, Philipp Horn, Simon Portegies Zwart, Elena
Sellentin, Barry Koren, Maxwell X. Cai
- Abstract summary: We study the use of Artificial Neural Networks (ANNs) to replace expensive parts of the integration of planetary systems.
We compare the results of the numerical integration of a planetary system with asteroids with those obtained by a Hamiltonian Neural Network and a conventional Deep Neural Network.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Simulating the evolution of the gravitational N-body problem becomes
extremely computationally expensive as N increases since the problem complexity
scales quadratically with the number of bodies. We study the use of Artificial
Neural Networks (ANNs) to replace expensive parts of the integration of
planetary systems. Neural networks that include physical knowledge have grown
in popularity in the last few years, although few attempts have been made to
use them to speed up the simulation of the motion of celestial bodies. We study
the advantages and limitations of using Hamiltonian Neural Networks to replace
computationally expensive parts of the numerical simulation. We compare the
results of the numerical integration of a planetary system with asteroids with
those obtained by a Hamiltonian Neural Network and a conventional Deep Neural
Network, with special attention to understanding the challenges of this
problem. Due to the non-linear nature of the gravitational equations of motion,
errors in the integration propagate. To increase the robustness of a method
that uses neural networks, we propose a hybrid integrator that evaluates the
prediction of the network and replaces it with the numerical solution if
considered inaccurate. Hamiltonian Neural Networks can make predictions that
resemble the behavior of symplectic integrators but are challenging to train
and in our case fail when the inputs differ by ~7 orders of magnitude. In
contrast, Deep Neural Networks are easy to train but fail to conserve energy,
leading to fast divergence from the reference solution. The hybrid integrator
designed to include the neural networks increases the reliability of the method
and prevents large energy errors without increasing the computing cost
significantly. For this problem, the use of neural networks results in faster
simulations when the number of asteroids is >70.
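The abstract describes the hybrid integrator only at a high level: predict with the network, accept the prediction if it passes a validity check, otherwise fall back to the numerical solution. Below is a minimal, hedged sketch of that control flow in Python; the leapfrog scheme, the energy-based acceptance criterion, and all function names are assumptions made for illustration rather than the authors' actual implementation, and the "surrogate" is just a noisy stand-in for a trained network.

```python
# Minimal sketch of the hybrid-integrator idea: a surrogate (standing in for
# the trained network) predicts the expensive accelerations, and the prediction
# is kept only if a cheap energy check passes; otherwise the step is redone
# with direct summation. The acceptance criterion and the two-body test are
# illustrative assumptions, not the paper's implementation.
import numpy as np

G = 1.0  # gravitational constant in code units


def pairwise_accelerations(pos, mass):
    """Direct-summation accelerations; cost grows quadratically with N."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        for j in range(len(mass)):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += G * mass[j] * r / np.linalg.norm(r) ** 3
    return acc


def surrogate_accelerations(pos, mass):
    """Placeholder for the trained network: exact forces plus small noise,
    so the fallback logic below can actually be exercised."""
    rng = np.random.default_rng(0)
    return pairwise_accelerations(pos, mass) * (
        1.0 + 1e-3 * rng.standard_normal(pos.shape)
    )


def total_energy(pos, vel, mass):
    kinetic = 0.5 * np.sum(mass * np.sum(vel**2, axis=1))
    potential = 0.0
    for i in range(len(mass)):
        for j in range(i + 1, len(mass)):
            potential -= G * mass[i] * mass[j] / np.linalg.norm(pos[i] - pos[j])
    return kinetic + potential


def hybrid_leapfrog_step(pos, vel, mass, dt, rel_energy_tol=1e-5):
    """Kick-drift-kick step that trusts the surrogate only while the relative
    energy error it introduces stays below rel_energy_tol (illustrative value)."""
    e0 = total_energy(pos, vel, mass)

    def kdk(force):
        vel_half = vel + 0.5 * dt * force(pos, mass)
        new_pos = pos + dt * vel_half
        new_vel = vel_half + 0.5 * dt * force(new_pos, mass)
        return new_pos, new_vel

    new_pos, new_vel = kdk(surrogate_accelerations)
    if abs(total_energy(new_pos, new_vel, mass) - e0) > rel_energy_tol * abs(e0):
        # Prediction judged inaccurate: fall back to the numerical solution.
        new_pos, new_vel = kdk(pairwise_accelerations)
    return new_pos, new_vel


# Two-body circular orbit as a smoke test (central mass plus a light companion).
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
vel = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
mass = np.array([1.0, 1e-3])
e_init = total_energy(pos, vel, mass)
for _ in range(1000):
    pos, vel = hybrid_leapfrog_step(pos, vel, mass, dt=1e-3)
print("relative energy drift:",
      abs(total_energy(pos, vel, mass) - e_init) / abs(e_init))
```

In the paper's setting the surrogate would replace only the expensive asteroid-perturbation forces while the planets are integrated numerically; the per-step fallback shown here is one simple way to realize the "replace the prediction if considered inaccurate" logic described above.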
Related papers
- Message Passing Variational Autoregressive Network for Solving Intractable Ising Models [6.261096199903392]
Many deep neural networks have been used to solve Ising models, including autoregressive neural networks, convolutional neural networks, recurrent neural networks, and graph neural networks.
Here we propose a variational autoregressive architecture with a message passing mechanism, which can effectively utilize the interactions between spin variables.
The new network trained under an annealing framework outperforms existing methods in solving several prototypical Ising spin Hamiltonians, especially for larger spin systems at low temperatures.
arXiv Detail & Related papers (2024-04-09T11:27:07Z) - Knowledge-Based Convolutional Neural Network for the Simulation and Prediction of Two-Phase Darcy Flows [3.5707423185282656]
Physics-informed neural networks (PINNs) have gained significant prominence as a powerful tool in the field of scientific computing and simulations.
We propose to combine the power of neural networks with the dynamics imposed by the discretized differential equations.
By discretizing the governing equations, the PINN learns to account for the discontinuities and accurately capture the underlying relationships between inputs and outputs.
arXiv Detail & Related papers (2024-04-04T06:56:32Z) - Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z) - Quantum Neural Network for Quantum Neural Computing [0.0]
We propose a new quantum neural network model for quantum neural computing.
Our model circumvents the problem that the state-space size grows exponentially with the number of neurons.
We benchmark our model for handwritten digit recognition and other nonlinear classification tasks.
arXiv Detail & Related papers (2023-05-15T11:16:47Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z) - Bayesian Physics-Informed Neural Networks for real-world nonlinear
dynamical systems [0.0]
We integrate data, physics, and uncertainties by combining neural networks, physics-informed modeling, and Bayesian inference.
Our study reveals the inherent advantages and disadvantages of Neural Networks, Bayesian Inference, and a combination of both.
We anticipate that the underlying concepts and trends generalize to more complex disease conditions.
arXiv Detail & Related papers (2022-05-12T19:04:31Z) - Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates, noisy and irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z) - Training Feedback Spiking Neural Networks by Implicit Differentiation on
the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z) - Reduced-Order Neural Network Synthesis with Robustness Guarantees [0.0]
Machine learning algorithms are being adapted to run locally on-board potentially hardware-limited devices to improve user privacy, reduce latency, and be more energy efficient.
To address this issue, a method to automatically synthesize reduced-order neural networks (having fewer neurons) approximating the input/output mapping of a larger one is introduced.
Worst-case bounds for this approximation error are obtained, and the approach can be applied to a wide variety of neural network architectures.
arXiv Detail & Related papers (2021-02-18T12:03:57Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)