Predicting Physics in Mesh-reduced Space with Temporal Attention
- URL: http://arxiv.org/abs/2201.09113v1
- Date: Sat, 22 Jan 2022 18:32:54 GMT
- Title: Predicting Physics in Mesh-reduced Space with Temporal Attention
- Authors: Xu Han and Han Gao and Tobias Pfaff and Jian-Xun Wang and Li-Ping Liu
- Abstract summary: We propose a new method that captures long-term dependencies through a transformer-style temporal attention model.
Our method outperforms a competitive GNN baseline on several complex fluid dynamics prediction tasks.
We believe our approach paves the way to bringing the benefits of attention-based sequence models to solving high-dimensional complex physics tasks.
- Score: 15.054026802351146
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph-based next-step prediction models have recently been very successful in
modeling complex high-dimensional physical systems on irregular meshes.
However, due to their short temporal attention span, these models suffer from
error accumulation and drift. In this paper, we propose a new method that
captures long-term dependencies through a transformer-style temporal attention
model. We introduce an encoder-decoder structure to summarize features and
create a compact mesh representation of the system state, allowing the temporal
model to operate on a low-dimensional mesh representation in a memory-efficient
manner. Our method outperforms a competitive GNN baseline on several
complex fluid dynamics prediction tasks, from sonic shocks to vascular flow. We
demonstrate stable rollouts without the need for training noise and show
perfectly phase-stable predictions even for very long sequences. More broadly,
we believe our approach paves the way to bringing the benefits of
attention-based sequence models to solving high-dimensional complex physics
tasks.
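The abstract describes a two-stage design: an encoder compresses the full mesh state into a small set of latent tokens, a transformer-style attention model predicts the next latent state from the history of earlier ones, and a decoder maps the prediction back onto the mesh nodes. The sketch below illustrates that pipeline; the module names, the attention-pooling scheme, and all dimensions are illustrative assumptions, not the authors' exact architecture.

```python
# Illustrative sketch only (assumed module names and sizes), not the authors'
# exact architecture: pool mesh nodes into a few latent tokens, run causal
# temporal attention over the token history, decode back to per-node fields.
import torch
import torch.nn as nn


class MeshEncoder(nn.Module):
    """Pool per-node features into a small, fixed set of latent tokens."""
    def __init__(self, node_dim=3, latent_dim=128, num_tokens=8):
        super().__init__()
        self.node_mlp = nn.Sequential(
            nn.Linear(node_dim, latent_dim), nn.ReLU(), nn.Linear(latent_dim, latent_dim))
        self.queries = nn.Parameter(torch.randn(num_tokens, latent_dim))
        self.pool = nn.MultiheadAttention(latent_dim, num_heads=4, batch_first=True)

    def forward(self, nodes):                      # nodes: (B, N, node_dim)
        h = self.node_mlp(nodes)                   # (B, N, D)
        q = self.queries.expand(nodes.size(0), -1, -1)
        tokens, _ = self.pool(q, h, h)             # (B, K, D)
        return tokens


class TemporalAttention(nn.Module):
    """Causal transformer over the sequence of flattened latent states."""
    def __init__(self, latent_dim=128, num_tokens=8, depth=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=latent_dim * num_tokens, nhead=8, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, latents):                    # latents: (B, T, K*D)
        T = latents.size(1)
        mask = torch.triu(torch.full((T, T), float("-inf"), device=latents.device), 1)
        return self.transformer(latents, mask=mask)[:, -1]   # next-step latent


class MeshDecoder(nn.Module):
    """Cross-attend node positions to the predicted tokens to recover fields."""
    def __init__(self, node_dim=3, latent_dim=128):
        super().__init__()
        self.pos_mlp = nn.Linear(3, latent_dim)    # node coordinates as queries
        self.attn = nn.MultiheadAttention(latent_dim, num_heads=4, batch_first=True)
        self.out = nn.Linear(latent_dim, node_dim)

    def forward(self, tokens, positions):          # tokens: (B, K, D), positions: (B, N, 3)
        q = self.pos_mlp(positions)
        h, _ = self.attn(q, tokens, tokens)
        return self.out(h)                         # (B, N, node_dim)


# One rollout step on random data (all shapes are assumptions).
B, N, T, K, D = 2, 500, 16, 8, 128                 # batch, nodes, history, tokens, width
enc, tmp, dec = MeshEncoder(), TemporalAttention(), MeshDecoder()
states = torch.randn(B, T, N, 3)                   # e.g. velocity/pressure per node
positions = torch.randn(B, N, 3)
tokens = torch.stack([enc(states[:, t]) for t in range(T)], dim=1)  # (B, T, K, D)
next_latent = tmp(tokens.flatten(2))               # (B, K*D)
next_state = dec(next_latent.view(B, K, D), positions)              # (B, N, 3)
```

The learned query tokens here stand in for whatever summarization the paper's encoder-decoder performs; the point the abstract makes is that temporal attention runs over this compact latent sequence rather than over the full mesh, which keeps memory usage manageable for long histories.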
Related papers
- Dynamic metastability in the self-attention model [22.689695473655906]
We consider the self-attention model - an interacting particle system on the unit sphere - which serves as a toy model for Transformers.
We prove the appearance of dynamic metastability conjectured in [GLPR23].
We show that under an appropriate time-rescaling, the energy reaches its global maximum in finite time and has a staircase profile.
arXiv Detail & Related papers (2024-10-09T12:50:50Z)
- Dynamical system prediction from sparse observations using deep neural networks with Voronoi tessellation and physics constraint [12.638698799995815]
We introduce the Dynamic System Prediction from Sparse Observations using Voronoi Tessellation (DSOVT) framework.
By integrating Voronoi tessellations with deep learning models, DSOVT is adept at predicting dynamical systems with sparse, unstructured observations.
Compared to purely data-driven models, our physics-based approach enables the model to learn physical laws within explicitly formulated dynamics.
arXiv Detail & Related papers (2024-08-31T13:43:52Z)
- Temporally-Consistent Koopman Autoencoders for Forecasting Dynamical Systems [42.6886113798806]
We introduce the Temporally-Consistent Koopman Autoencoder (tcKAE).
tcKAE generates accurate long-term predictions even with constrained and noisy training data.
We demonstrate tcKAE's superior performance over state-of-the-art KAE models across a variety of test cases.
arXiv Detail & Related papers (2024-03-19T00:48:25Z)
- Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting.
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z)
- Learning Robust Precipitation Forecaster by Temporal Frame Interpolation [65.5045412005064]
We develop a robust precipitation forecasting model that demonstrates resilience against spatial-temporal discrepancies.
Our approach has led to significant improvements in forecasting precision, culminating in our model securing 1st place in the transfer learning leaderboard of the Weather4cast'23 competition.
arXiv Detail & Related papers (2023-11-30T08:22:08Z)
- Generative Modeling with Phase Stochastic Bridges [49.4474628881673]
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z)
- Evolve Smoothly, Fit Consistently: Learning Smooth Latent Dynamics For Advection-Dominated Systems [14.553972457854517]
We present a data-driven, space-time continuous framework to learn surrogate models for complex physical systems.
We leverage the expressive power of the network and a specially designed consistency-inducing regularization to obtain latent trajectories that are both low-dimensional and smooth.
arXiv Detail & Related papers (2023-01-25T03:06:03Z)
- Data-driven low-dimensional dynamic model of Kolmogorov flow [0.0]
Reduced order models (ROMs) that capture flow dynamics are of interest for decreasing computational costs for simulation.
This work presents a data-driven framework for minimal-dimensional models that effectively capture the dynamics and properties of the flow.
We apply this to Kolmogorov flow in a regime consisting of chaotic and intermittent behavior.
arXiv Detail & Related papers (2022-10-29T23:05:39Z)
- Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.