Bond Graphs for multi-physics informed Neural Networks for multi-variate time series
- URL: http://arxiv.org/abs/2405.13586v2
- Date: Fri, 2 Aug 2024 14:59:48 GMT
- Title: Bond Graphs for multi-physics informed Neural Networks for multi-variate time series
- Authors: Alexis-Raja Brachet, Pierre-Yves Richard, Céline Hudelot
- Abstract summary: Existing methods are not adapted to tasks with complex multi-physical and multi-domain phenomena.
We propose a Neural Bond graph Encoder (NBgE) producing multi-physics-informed representations that can be fed into any task-specific model.
- Score: 6.775534755081169
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the trend of hybrid Artificial Intelligence techniques, Physics-Informed Machine Learning has seen a growing interest. It operates mainly by imposing data, learning, or architecture bias with simulation data, Partial Differential Equations, or equivariance and invariance properties. While it has shown great success on tasks involving one physical domain, such as fluid dynamics, existing methods are not adapted to tasks with complex multi-physical and multi-domain phenomena. In addition, it is mainly formulated as an end-to-end learning scheme. To address these challenges, we propose to leverage Bond Graphs, a multi-physics modeling approach, together with Message Passing Graph Neural Networks. We propose a Neural Bond graph Encoder (NBgE) producing multi-physics-informed representations that can be fed into any task-specific model. It provides a unified way to integrate both data and architecture biases in deep learning. Our experiments on two challenging multi-domain physical systems - a Direct Current Motor and the Respiratory System - demonstrate the effectiveness of our approach on a multivariate time-series forecasting task.
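To make the encode-then-forecast pipeline from the abstract concrete, here is a minimal sketch, assuming a fixed adjacency matrix derived from a bond-graph model of the system and plain PyTorch message passing. The class names (`BondGraphMP`, `NBgEStyleEncoder`), dimensions, and the toy adjacency are illustrative assumptions, not the paper's actual architecture.

```python
# Sketch only: message passing over a bond-graph-derived adjacency, producing
# multi-physics-informed node embeddings that are fed to a task-specific head.
import torch
import torch.nn as nn

class BondGraphMP(nn.Module):
    """One round of message passing over a fixed (bond-graph-derived) adjacency."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.msg = nn.Linear(in_dim, hidden_dim)                # transform neighbour features
        self.upd = nn.Linear(in_dim + hidden_dim, hidden_dim)   # combine self + aggregated messages

    def forward(self, x, adj):
        # x: (num_nodes, in_dim) node features, adj: (num_nodes, num_nodes) 0/1 adjacency
        messages = adj @ self.msg(x)                            # sum messages from neighbours
        return torch.relu(self.upd(torch.cat([x, messages], dim=-1)))

class NBgEStyleEncoder(nn.Module):
    """Stacked message passing producing physics-informed node embeddings."""
    def __init__(self, in_dim, hidden_dim, num_layers=2):
        super().__init__()
        dims = [in_dim] + [hidden_dim] * num_layers
        self.layers = nn.ModuleList([BondGraphMP(d, hidden_dim) for d in dims[:-1]])

    def forward(self, x, adj):
        for layer in self.layers:
            x = layer(x, adj)
        return x                                                # (num_nodes, hidden_dim)

# Toy usage: 4 physical variables observed over a window of 16 timesteps.
# Each node is one variable; its feature vector is its own history.
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=torch.float32)         # assumed bond-graph connectivity
x = torch.randn(4, 16)                                          # (variables, window length)
encoder = NBgEStyleEncoder(in_dim=16, hidden_dim=32)
forecast_head = nn.Linear(32, 8)                                # stand-in for any task-specific model
z = encoder(x, adj)
y_hat = forecast_head(z)                                        # (4 variables, 8 future steps)
print(y_hat.shape)
```

In the paper's setup, the encoder output would be consumed by an arbitrary downstream forecasting model; the single linear head above merely stands in for that task-specific component.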
Related papers
- Learning System Dynamics without Forgetting [60.08612207170659]
Predicting trajectories of systems with unknown dynamics is crucial in various research fields, including physics and biology.
We present a novel framework of Mode-switching Graph ODE (MS-GODE), which can continually learn varying dynamics.
We construct a novel benchmark of biological dynamic systems, featuring diverse systems with disparate dynamics.
arXiv Detail & Related papers (2024-06-30T14:55:18Z) - Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z) - Learning Latent Dynamics via Invariant Decomposition and (Spatio-)Temporal Transformers [0.6767885381740952]
We propose a method for learning dynamical systems from high-dimensional empirical data.
We focus on the setting in which data are available from multiple different instances of a system.
We study behaviour through simple theoretical analyses and extensive experiments on synthetic and real-world datasets.
arXiv Detail & Related papers (2023-06-21T07:52:07Z) - FAS-UNet: A Novel FAS-driven Unet to Learn Variational Image Segmentation [3.741136641573471]
We propose a novel variational-model-informed network (FAS-Unet) that exploits the model and algorithm priors to extract the multi-scale features.
The proposed network integrates image data and mathematical models, and implements them through learning a few convolution kernels.
Experimental results show that the proposed FAS-Unet is very competitive with other state-of-the-art methods in qualitative, quantitative and model complexity evaluations.
arXiv Detail & Related papers (2022-10-27T04:15:16Z) - Geometric multimodal representation learning [13.159512679346687]
Multimodal learning methods fuse multiple data modalities while leveraging cross-modal dependencies.
We put forward an algorithmic blueprint for multimodal graph learning based on this categorization.
This effort can pave the way for standardizing the design of sophisticated multimodal architectures for highly complex real-world problems.
arXiv Detail & Related papers (2022-09-07T16:59:03Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z) - Continuous-in-Depth Neural Networks [107.47887213490134]
We first show that ResNets fail to be meaningful dynamical integrators in this richer sense.
We then demonstrate that neural network models can learn to represent continuous dynamical systems.
We introduce ContinuousNet as a continuous-in-depth generalization of ResNet architectures.
arXiv Detail & Related papers (2020-08-05T22:54:09Z) - Learning to Simulate Complex Physics with Graph Networks [68.43901833812448]
We present a machine learning framework and model implementation that can learn to simulate a wide variety of challenging physical domains.
Our framework, which we term "Graph Network-based Simulators" (GNS), represents the state of a physical system with particles, expressed as nodes in a graph, and computes dynamics via learned message-passing.
Our results show that our model can generalize from single-timestep predictions with thousands of particles during training, to different initial conditions, thousands of timesteps, and at least an order of magnitude more particles at test time (a rough sketch of this learned message-passing idea follows after this entry).
arXiv Detail & Related papers (2020-02-21T16:44:28Z)
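Below is a minimal, hedged sketch of the learned message-passing idea the GNS entry describes: particles become graph nodes, edges connect nearby particles, and a small network predicts per-particle accelerations that are integrated forward. Everything here (the `radius_graph` helper, `ParticleMP`, the cutoff and step size) is an illustrative assumption for exposition, not the authors' implementation.

```python
# Sketch only: one step of a GNS-style learned simulator over particle states.
import torch
import torch.nn as nn

def radius_graph(pos, cutoff):
    # Dense pairwise connectivity: edge (i, j) if particles are within `cutoff`.
    dist = torch.cdist(pos, pos)
    return ((dist < cutoff) & (dist > 0)).float()

class ParticleMP(nn.Module):
    """Learned message passing that predicts per-particle acceleration."""
    def __init__(self, feat_dim, hidden_dim, dim=2):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(feat_dim, hidden_dim), nn.ReLU())
        self.acc = nn.Sequential(nn.Linear(feat_dim + hidden_dim, hidden_dim),
                                 nn.ReLU(), nn.Linear(hidden_dim, dim))

    def forward(self, feats, adj):
        messages = adj @ self.msg(feats)               # aggregate neighbour messages
        return self.acc(torch.cat([feats, messages], dim=-1))

def simulator_step(pos, vel, model, cutoff=0.5, dt=0.01):
    feats = torch.cat([pos, vel], dim=-1)              # (num_particles, 4) in 2-D
    adj = radius_graph(pos, cutoff)
    acc = model(feats, adj)                            # learned dynamics
    vel = vel + dt * acc                               # semi-implicit Euler rollout
    pos = pos + dt * vel
    return pos, vel

# Toy rollout with 100 particles in 2-D.
pos, vel = torch.rand(100, 2), torch.zeros(100, 2)
model = ParticleMP(feat_dim=4, hidden_dim=64)
for _ in range(10):
    pos, vel = simulator_step(pos, vel, model)
```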