Lagrangian Density Space-Time Deep Neural Network Topology
- URL: http://arxiv.org/abs/2207.12209v1
- Date: Thu, 30 Jun 2022 03:29:35 GMT
- Title: Lagrangian Density Space-Time Deep Neural Network Topology
- Authors: Bhupesh Bishnoi
- Abstract summary: We have proposed a "Lagrangian Density Space-Time Deep Neural Networks" (LDDNN) topology.
It is qualified for unsupervised training and learning to predict the dynamics of underlying physical science governed phenomena.
This article will discuss the statistical physics interpretation of neural networks in the Lagrangian and Hamiltonian domains.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: As a network-based functional approximator, we have proposed a "Lagrangian
Density Space-Time Deep Neural Networks" (LDDNN) topology. It is qualified for
unsupervised training and learning to predict the dynamics of underlying
physical science governed phenomena. The prototypical network respects the
fundamental conservation laws of nature through the succinctly described
Lagrangian and Hamiltonian density of the system by a given data-set of
generalized nonlinear partial differential equations. The objective is to
parameterize the Lagrangian density over a neural network and directly learn
from it through data instead of hand-crafting an exact time-dependent "Action
solution" of Lagrangian density for the physical system. With this novel
approach, one can understand and open up the information inference aspect of the
"Black-box deep machine learning representation" for the physical dynamics of
nature by constructing custom-tailored network interconnect topologies,
activation, and loss/cost functions based on the underlying physical
differential operators. This article will discuss the statistical physics
interpretation of neural networks in the Lagrangian and Hamiltonian domains.
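For orientation, the mechanism the abstract describes amounts to enforcing the Euler-Lagrange equation on a learned Lagrangian. For a single degree of freedom,
$$\frac{d}{dt}\frac{\partial L}{\partial \dot q} - \frac{\partial L}{\partial q} = 0
\quad\Longrightarrow\quad
\ddot q = \frac{\partial L/\partial q - \left(\partial^2 L/\partial \dot q\,\partial q\right)\dot q}{\partial^2 L/\partial \dot q^2},$$
so accelerations can be recovered from any differentiable parameterization of $L$. The JAX sketch below illustrates that general pattern (in the spirit of Lagrangian-network approaches), not the paper's LDDNN topology; the MLP sizes, the toy pendulum sample, and the acceleration-matching loss are illustrative assumptions rather than the authors' choices.

```python
import jax
import jax.numpy as jnp

def lagrangian(params, q, qdot):
    """Scalar L(q, qdot) parameterized by a small MLP (placeholder sizes)."""
    h = jnp.concatenate([q, qdot])
    for W, b in params[:-1]:
        h = jnp.tanh(W @ h + b)
    W, b = params[-1]
    return (W @ h + b)[0]

def accelerations(params, q, qdot):
    """Solve the (vector) Euler-Lagrange equation for qddot via autodiff."""
    L = lambda q, qdot: lagrangian(params, q, qdot)
    grad_q = jax.grad(L, argnums=0)(q, qdot)                           # dL/dq
    hess_qdot = jax.hessian(L, argnums=1)(q, qdot)                     # d2L/dqdot2
    mixed = jax.jacobian(jax.grad(L, argnums=1), argnums=0)(q, qdot)   # d2L/dqdot dq
    return jnp.linalg.solve(hess_qdot, grad_q - mixed @ qdot)

def loss(params, q, qdot, qddot_obs):
    """Match observed accelerations along trajectories; no force labels needed."""
    pred = jax.vmap(lambda a, b: accelerations(params, a, b))(q, qdot)
    return jnp.mean((pred - qddot_obs) ** 2)

def init(key, sizes=(2, 64, 64, 1)):
    params = []
    for m, n in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((0.1 * jax.random.normal(sub, (n, m)), jnp.zeros(n)))
    return params

params = init(jax.random.PRNGKey(0))
# One gradient evaluation on a toy single-pendulum sample (q, qdot, qddot each 1-D).
q = jnp.array([[0.3]]); qdot = jnp.array([[0.0]]); qddot = -9.8 * jnp.sin(jnp.array([[0.3]]))
grads = jax.grad(loss)(params, q, qdot, qddot)
```

Custom loss terms built from the underlying differential operators, as the abstract suggests, would be added to this objective; the sketch keeps only the acceleration-matching term.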
Related papers
- Neural Networks-based Random Vortex Methods for Modelling Incompressible Flows [0.0]
We introduce a novel Neural Networks-based approach for approximating solutions to the (2D) incompressible Navier--Stokes equations.
Our algorithm uses a Physics-informed Neural Network, that approximates the vorticity based on a loss function that uses a computationally efficient formulation of the Random Vortex dynamics.
arXiv Detail & Related papers (2024-05-22T14:36:23Z) - Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
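As background for the reduction mentioned above (a generic illustration, not the NeuRLP relaxation itself): an explicit Euler collocation of a linear ODE $\dot{x}(t) = A\,x(t) + b$ on a grid $t_0,\dots,t_N$ with step $h$ yields the linear constraints
$$x_{k+1} - x_k - h\,(A x_k + b) = s_k, \qquad k = 0,\dots,N-1,$$
and minimizing $\sum_k \|s_k\|_1$ subject to these equalities (after the usual splitting of each slack $s_k$ into nonnegative parts) is a linear program in the unknowns $x_0,\dots,x_N$ and the slacks.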
arXiv Detail & Related papers (2024-02-20T15:23:24Z) - Fully Differentiable Lagrangian Convolutional Neural Network for
Continuity-Consistent Physics-Informed Precipitation Nowcasting [0.0]
We present a convolutional neural network model for precipitation nowcasting that combines data-driven learning with physics-informed domain knowledge.
We propose LUPIN, a Lagrangian Double U-Net for Physics-Informed Nowcasting, that draws from existing extrapolation-based nowcasting methods.
Based on our evaluation, LUPIN matches and exceeds the performance of the chosen benchmark, opening the door for other Lagrangian machine learning models.
arXiv Detail & Related papers (2024-02-16T15:13:30Z) - Generalized Lagrangian Neural Networks [8.065464912030352]
We introduce a groundbreaking extension (Generalized Lagrangian Neural Networks) to Lagrangian Neural Networks (LNNs).
By leveraging the foundational importance of the Lagrangian within Lagrange's equations, we formulate the model based on the generalized Lagrange's equation.
This modification not only enhances prediction accuracy but also guarantees Lagrangian representation in non-conservative systems.
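For reference, the generalized Lagrange equation this extension builds on augments the conservative Euler-Lagrange equation with generalized forces,
$$\frac{d}{dt}\frac{\partial L}{\partial \dot q_i} - \frac{\partial L}{\partial q_i} = Q_i,$$
where the $Q_i$ collect non-conservative contributions such as friction or external driving; setting $Q_i = 0$ recovers the conservative case handled by standard LNNs. (This is the textbook form; the paper's exact parameterization of $Q_i$ is not reproduced here.)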
arXiv Detail & Related papers (2024-01-08T08:26:40Z) - A Bayesian framework for discovering interpretable Lagrangian of
dynamical systems from data [1.0878040851638]
We propose an alternate framework for learning interpretable Lagrangian descriptions of physical systems.
Unlike existing neural network-based approaches, the proposed approach yields an interpretable description of the Lagrangian.
arXiv Detail & Related papers (2023-10-10T01:35:54Z) - Addressing caveats of neural persistence with deep graph persistence [54.424983583720675]
We find that the variance of network weights and spatial concentration of large weights are the main factors that impact neural persistence.
We propose an extension of the filtration underlying neural persistence to the whole neural network instead of single layers.
This yields our deep graph persistence measure, which implicitly incorporates persistent paths through the network and alleviates variance-related issues.
arXiv Detail & Related papers (2023-07-20T13:34:11Z) - PAC-NeRF: Physics Augmented Continuum Neural Radiance Fields for
Geometry-Agnostic System Identification [64.61198351207752]
Existing approaches to system identification (estimating the physical parameters of an object) from videos assume known object geometries.
In this work, we aim to identify parameters characterizing a physical system from a set of multi-view videos without any assumption on object geometry or topology.
We propose "Physics Augmented Continuum Neural Radiance Fields" (PAC-NeRF), to estimate both the unknown geometry and physical parameters of highly dynamic objects from multi-view videos.
arXiv Detail & Related papers (2023-03-09T18:59:50Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
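The encode-evolve-decode pattern that summary describes can be made concrete with a minimal JAX sketch: a 2D field is mapped to DCT coefficients, a recurrent update acts in spectral space, and the inverse transform returns the field. The transform construction is standard; the single dense recurrent update, the field size, and the initialization are placeholders, not the paper's architecture.

```python
import jax
import jax.numpy as jnp

def dct_matrix(n):
    """Orthonormal DCT-II matrix; its transpose is the inverse transform."""
    k = jnp.arange(n)[:, None]
    i = jnp.arange(n)[None, :]
    mat = jnp.sqrt(2.0 / n) * jnp.cos(jnp.pi * (2 * i + 1) * k / (2 * n))
    return mat.at[0].multiply(1.0 / jnp.sqrt(2.0))

def encode(field, C):
    """2D DCT of an (n, n) field: transform rows and columns."""
    return C @ field @ C.T

def decode(coeffs, C):
    return C.T @ coeffs @ C

def step(coeffs, W):
    """Placeholder recurrent update acting on the flattened spectrum."""
    return jnp.tanh(W @ coeffs.reshape(-1)).reshape(coeffs.shape)

n = 32
C = dct_matrix(n)
key, k1, k2 = jax.random.split(jax.random.PRNGKey(0), 3)
field = jax.random.normal(k1, (n, n))            # e.g. one velocity component at t0
W = 0.01 * jax.random.normal(k2, (n * n, n * n))
coeffs = step(encode(field, C), W)               # one roll-out step in spectral space
prediction = decode(coeffs, C)
```

A physics-informed residual (for instance the Navier-Stokes operator evaluated on the decoded field) would be added to the training loss; the sketch omits it.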
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Physics-informed ConvNet: Learning Physical Field from a Shallow Neural
Network [0.180476943513092]
Modelling and forecasting multi-physical systems remain a challenge due to unavoidable data scarcity and noise.
A new framework, the physics-informed convolutional network (PICN), is proposed from a CNN perspective.
PICN may become an alternative neural network solver in physics-informed machine learning.
arXiv Detail & Related papers (2022-01-26T14:35:58Z) - Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to the global approximation, physics informed neural networks have difficulties in displaying localized effects and strong non-linear solutions by optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement, and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT-scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
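To make the construction tangible: the liquid time-constant state evolves as $\dot x = -\left(1/\tau + f(x, I)\right) x + f(x, I)\,A$, with a learned gate $f$ and bias state $A$, advanced by a fused explicit/implicit Euler step. The JAX sketch below shows one such update with a toy single-layer gate; the sizes, the initialization, and the plain Python loop are illustrative assumptions, not the paper's implementation.

```python
import jax
import jax.numpy as jnp

def ltc_cell(x, inp, params, tau=1.0, dt=0.1):
    """One fused explicit/implicit Euler update of a liquid time-constant state."""
    W, U, b, A = params
    f = jax.nn.sigmoid(W @ x + U @ inp + b)        # state- and input-dependent gate
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

hidden, inputs = 8, 4
k1, k2, k3, k4 = jax.random.split(jax.random.PRNGKey(0), 4)
params = (0.1 * jax.random.normal(k1, (hidden, hidden)),   # W: recurrent weights
          0.1 * jax.random.normal(k2, (hidden, inputs)),   # U: input weights
          jnp.zeros(hidden),                               # b: gate bias
          jax.random.normal(k3, (hidden,)))                # A: learned bias state
x = jnp.zeros(hidden)
for u in jax.random.normal(k4, (10, inputs)):              # length-10 input sequence
    x = ltc_cell(x, u, params)
```

The division by $1 + \Delta t\,(1/\tau + f)$ keeps the update well behaved for any positive gate value, which is consistent with the bounded-behavior claim in the summary.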
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.