Physics Constrained Flow Neural Network for Short-Timescale Predictions
in Data Communications Networks
- URL: http://arxiv.org/abs/2112.12321v3
- Date: Sun, 2 Apr 2023 09:55:37 GMT
- Title: Physics Constrained Flow Neural Network for Short-Timescale Predictions
in Data Communications Networks
- Authors: Xiangle Cheng, James He, Shihan Xiao, Yingxue Zhang, Zhitang Chen,
Pascal Poupart, Fenglin Li
- Abstract summary: This paper introduces Flow Neural Network (FlowNN) to improve the feature representation with learned physical bias.
FlowNN achieves a 17% to 71% decrease in loss compared with state-of-the-art baselines on both synthetic and real-world networking datasets.
- Score: 31.85361736992165
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Machine learning is gaining growing momentum in various recent models for the
dynamic analysis of information flows in data communications networks. These
preliminary models often rely on off-the-shelf learning models to predict from
historical statistics while disregarding the physics governing the generating
behaviors of these flows. This paper instead introduces Flow Neural Network
(FlowNN) to improve the feature representation with learned physical bias. This
is implemented by an induction layer, working upon the embedding layer, to
impose physics-connected data correlations, and by a self-supervised learning
strategy with stop-gradient to make the learned physics universal. On
short-timescale network prediction tasks, FlowNN achieves a 17% to 71%
decrease in loss compared with the state-of-the-art baselines on both
synthetic and real-world networking datasets, demonstrating the strength of
this new approach.
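The stacked embedding-plus-induction design and the stop-gradient objective described above can be sketched numerically. The snippet below is a minimal illustration with made-up weights, and the mixing rule standing in for the induction layer is a hypothetical placeholder, not the paper's actual operator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: a batch of flow time series, T timesteps, F features.
B, T, F, H = 4, 8, 3, 16
x = rng.normal(size=(B, T, F))

# Embedding layer: per-timestep projection (weights are illustrative).
W_emb = rng.normal(scale=0.1, size=(F, H))
def embed(x):
    return np.tanh(x @ W_emb)

# "Induction layer": mixes each timestep's embedding with its predecessor,
# a stand-in for the paper's physics-connected correlation structure.
alpha = 0.5
def induct(h):
    out = h.copy()
    out[:, 1:] = alpha * h[:, 1:] + (1 - alpha) * h[:, :-1]
    return out

z_online = induct(embed(x))   # branch that would receive gradients
z_target = embed(x).copy()    # stop-gradient: treated as a constant target

# Self-supervised objective: align the induced representation with the
# frozen embedding (cosine similarity averaged over batch and time).
def cos_sim(a, b, eps=1e-8):
    num = (a * b).sum(-1)
    den = np.linalg.norm(a, axis=-1) * np.linalg.norm(b, axis=-1) + eps
    return num / den

loss = 1.0 - cos_sim(z_online, z_target).mean()
print(round(float(loss), 4))
```

In a trained model the online branch would be updated by backpropagation while the stop-gradient branch supplies a fixed target, which is what keeps the learned physics from collapsing to a trivial solution.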
Related papers
- TG-PhyNN: An Enhanced Physically-Aware Graph Neural Network framework for forecasting Spatio-Temporal Data [3.268628956733623]
This work presents TG-PhyNN, a novel Temporal Graph Physics-Informed Neural Network framework.
TG-PhyNN leverages the power of GNNs for graph-based modeling while simultaneously incorporating physical constraints as a guiding principle during training.
Our findings demonstrate that TG-PhyNN significantly outperforms traditional forecasting models.
TG-PhyNN effectively exploits physical constraints to offer more reliable and accurate forecasts in various domains where physical processes govern the dynamics of data.
arXiv Detail & Related papers (2024-08-29T09:41:17Z) - Contrastive Representation Learning for Dynamic Link Prediction in Temporal Networks [1.9389881806157312]
We introduce a self-supervised method for learning representations of temporal networks.
We propose a recurrent message-passing neural network architecture for modeling the information flow over time-respecting paths of temporal networks.
The proposed method is tested on Enron, COLAB, and Facebook datasets.
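A common way to realize such a self-supervised objective over temporal network representations is a contrastive (InfoNCE-style) loss between two views of the same node. The sketch below uses random placeholder embeddings rather than the paper's recurrent message-passing encoder:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical node embeddings: 'anchor' and 'positive' stand for two views
# of the same node's temporal neighborhood (here, a small perturbation).
N, D = 6, 8
anchor = rng.normal(size=(N, D))
positive = anchor + 0.1 * rng.normal(size=(N, D))

def l2_normalize(x):
    return x / np.linalg.norm(x, axis=1, keepdims=True)

a, p = l2_normalize(anchor), l2_normalize(positive)
tau = 0.1
logits = a @ p.T / tau                        # pairwise similarities
logits -= logits.max(axis=1, keepdims=True)   # numerical stability
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
infonce = -np.mean(np.diag(log_probs))        # matching pairs on the diagonal
print(float(infonce) > 0)
```

Minimizing this loss pulls each node's two views together while pushing apart embeddings of different nodes, which is the property a downstream link predictor relies on.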
arXiv Detail & Related papers (2024-08-22T22:50:46Z) - Physics-guided Active Sample Reweighting for Urban Flow Prediction [75.24539704456791]
Urban flow prediction is a spatio-temporal modeling task that estimates the throughput of transportation services such as buses, taxis, and ride-hailing services.
Some recent prediction solutions bring remedies with the notion of physics-guided machine learning (PGML).
We develop a physics-guided network (PN) and propose a data-aware framework, Physics-guided Active Sample Reweighting (P-GASR).
arXiv Detail & Related papers (2024-07-18T15:44:23Z) - Data-Driven Dynamic Friction Models based on Recurrent Neural Networks [0.0]
Recurrent Neural Networks (RNNs) based on the Gated Recurrent Unit (GRU) architecture learn the complex dynamics of rate-and-state friction laws from synthetic data.
It is found that the GRU-based RNNs effectively learn to predict changes in the friction coefficient resulting from velocity jumps.
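A GRU cell of the kind used in such models is straightforward to write down. The numpy forward pass below uses random, untrained weights purely to illustrate how a hidden state could track a friction-like state variable across a velocity jump:

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal GRU cell (forward pass only). The input could be log velocity;
# a trained hidden state would track the evolving friction state.
I, H = 1, 4  # input size, hidden size

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative random weights; a real model would learn these.
Wz, Uz, bz = rng.normal(size=(I, H)), rng.normal(size=(H, H)), np.zeros(H)
Wr, Ur, br = rng.normal(size=(I, H)), rng.normal(size=(H, H)), np.zeros(H)
Wh, Uh, bh = rng.normal(size=(I, H)), rng.normal(size=(H, H)), np.zeros(H)

def gru_step(x, h):
    z = sigmoid(x @ Wz + h @ Uz + bz)             # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)             # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh + bh) # candidate state
    return (1 - z) * h + z * h_tilde

# A velocity jump, as in rate-and-state friction experiments.
velocity = np.concatenate([np.full(10, 1.0), np.full(10, 10.0)])
h = np.zeros(H)
for v in np.log(velocity):
    h = gru_step(np.array([v]), h)
print(h.shape)
```

The gating structure is what lets the GRU retain the slowly evolving state variable while reacting quickly to the step change in velocity.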
arXiv Detail & Related papers (2024-02-21T22:11:01Z) - Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z) - Online Evolutionary Neural Architecture Search for Multivariate
Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z) - STDEN: Towards Physics-Guided Neural Networks for Traffic Flow
Prediction [31.49270000605409]
The lack of integration between physical principles and data-driven models is an important reason limiting the development of this field.
We propose a physics-guided deep learning model named Spatio-Temporal Differential Equation Network (STDEN), which casts the physical mechanism of traffic flow dynamics into a deep neural network framework.
Experiments on three real-world traffic datasets in Beijing show that our model outperforms state-of-the-art baselines by a significant margin.
arXiv Detail & Related papers (2022-09-01T04:58:18Z) - Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate
Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
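The core of any temporal convolutional network is a dilated causal convolution, in which each output depends only on past inputs. The sketch below uses an illustrative averaging kernel, not the paper's trained weights:

```python
import numpy as np

# Dilated causal 1D convolution: left-padding keeps the operation causal,
# and the dilation widens the receptive field without extra parameters.
def causal_conv1d(x, kernel, dilation=1):
    k = len(kernel)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])  # left-pad with zeros
    return np.array([
        sum(kernel[j] * xp[i + pad - j * dilation] for j in range(k))
        for i in range(len(x))
    ])

x = np.arange(6, dtype=float)  # e.g. a short state time series
y = causal_conv1d(x, kernel=[0.5, 0.5], dilation=2)
print(y.tolist())  # [0.0, 0.5, 1.0, 2.0, 3.0, 4.0]
```

Each output averages the current sample with the one two steps back, never looking ahead, which is the property that makes such layers usable for online dynamics prediction.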
arXiv Detail & Related papers (2022-06-07T13:51:35Z) - PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive
Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
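One explicit-Euler step of such an input-dependent first-order system can be sketched as follows. The update rule is an illustrative LTC-style form with random weights, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(3)

# A network of linear first-order units whose effective time constant
# depends on the input, in the spirit of liquid time-constant networks.
H, I = 4, 2
W = rng.normal(scale=0.5, size=(I, H))
tau = 1.0   # base time constant
dt = 0.1    # Euler step size

def ltc_step(h, x):
    f = np.tanh(x @ W)  # input-dependent drive
    # dh/dt = -(1/tau + |f|) * h + f : the |f| term tightens the effective
    # time constant when the drive is strong (illustrative form).
    dh = -(1.0 / tau + np.abs(f)) * h + f
    return h + dt * dh

h = np.zeros(H)
for _ in range(100):
    h = ltc_step(h, np.ones(I))
print(bool(np.all(np.isfinite(h))))
```

Because the decay rate grows with the drive, each unit's state converges to a bounded fixed point, which is the stability property the abstract refers to.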
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Thermodynamics-based Artificial Neural Networks for constitutive
modeling [0.0]
We propose a new class of data-driven, physics-based neural networks for the modeling of strain-rate-independent processes at the material point level.
The two basic principles of thermodynamics are encoded in the network's architecture by taking advantage of automatic differentiation.
We demonstrate the wide applicability of TANNs for modeling elasto-plastic materials with strain hardening and softening.
arXiv Detail & Related papers (2020-05-25T15:56:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.