ElastoGen: 4D Generative Elastodynamics
- URL: http://arxiv.org/abs/2405.15056v1
- Date: Thu, 23 May 2024 21:09:36 GMT
- Title: ElastoGen: 4D Generative Elastodynamics
- Authors: Yutao Feng, Yintong Shang, Xiang Feng, Lei Lan, Shandian Zhe, Tianjia Shao, Hongzhi Wu, Kun Zhou, Hao Su, Chenfanfu Jiang, Yin Yang
- Abstract summary: ElastoGen is a knowledge-driven model that generates physically accurate and coherent 4D elastodynamics.
It learns from established physical knowledge, such as partial differential equations and their numerical solutions.
- Score: 59.20029207991106
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present ElastoGen, a knowledge-driven model that generates physically accurate and coherent 4D elastodynamics. Instead of relying on petabyte-scale data-driven learning, ElastoGen leverages the principles of physics-in-the-loop and learns from established physical knowledge, such as partial differential equations and their numerical solutions. The core idea of ElastoGen is converting the global differential operator, corresponding to the nonlinear elastodynamic equations, into iterative local convolution-like operations, which naturally fit modern neural networks. Each network module is specifically designed to support this goal rather than functioning as a black box. As a result, ElastoGen is exceptionally lightweight in terms of both training requirements and network scale. Additionally, due to its alignment with physical procedures, ElastoGen efficiently generates accurate dynamics for a wide range of hyperelastic materials and can be easily integrated with upstream and downstream deep modules to enable end-to-end 4D generation.
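The core conversion described in the abstract, a global differential operator realized as repeated local, convolution-like updates, can be illustrated with a classical example: Jacobi iteration for Laplace's equation, where each sweep applies a fixed 4-neighbor averaging stencil to every interior grid point. This is a generic sketch of the idea only, not ElastoGen's actual network or its nonlinear elastodynamic operator.

```python
import numpy as np

def jacobi_step(u):
    """One Jacobi sweep: apply the 4-neighbor averaging stencil to the interior."""
    v = u.copy()
    v[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                            u[1:-1, :-2] + u[1:-1, 2:])
    return v

# Solve Laplace's equation on a grid with fixed (Dirichlet) boundaries by
# iterating the local stencil; the global solve emerges from local updates.
n = 32
u = np.zeros((n, n))
u[0, :] = 1.0                  # hot top edge, cold elsewhere
for _ in range(2000):
    u = jacobi_step(u)

# Discrete Laplacian residual at interior points should be near zero.
lap = (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
       - 4 * u[1:-1, 1:-1])
print(float(np.abs(lap).max()))
```

Each Jacobi step is exactly a convolution with the kernel [[0, ¼, 0], [¼, 0, ¼], [0, ¼, 0]] applied to interior points, which is why such iterative solvers map naturally onto convolutional network layers.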
Related papers
- Flextron: Many-in-One Flexible Large Language Model [85.93260172698398]
We introduce Flextron, a network architecture and post-training model optimization framework supporting flexible model deployment.
We present a sample-efficient training method and associated routing algorithms for transforming an existing trained LLM into a Flextron model.
We demonstrate superior performance over multiple end-to-end trained variants and other state-of-the-art elastic networks, all with a single pretraining run that consumes a mere 7.63% of the tokens used in the original pretraining.
arXiv Detail & Related papers (2024-06-11T01:16:10Z)
- Data-driven low-dimensional model of a sedimenting flexible fiber [0.0]
This work describes a data-driven technique to create high-fidelity low-dimensional models of flexible fiber dynamics using machine learning.
The approach combines an autoencoder neural network architecture to learn a low-dimensional latent representation of the filament shape.
We show that our data-driven model can accurately forecast the evolution of a fiber at both trained and untrained elasto-gravitational numbers.
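The low-dimensional-latent idea above can be sketched without a training loop: a linear autoencoder's optimal encoder/decoder pair coincides with truncated SVD (PCA), so a rank-k SVD plays the role of the learned latent representation. The synthetic rank-2 data below stands in for fiber shape snapshots; all names and dimensions are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
modes = rng.normal(size=(200, 2))        # 2 underlying deformation modes
basis = rng.normal(size=(2, 20))
X = modes @ basis                        # 200 snapshots of a 20-dim "shape"

# Rank-2 truncated SVD = the optimal 2-dim linear encoder/decoder pair.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Z = U[:, :2] * s[:2]                     # 2-dim latent codes
Xhat = Z @ Vt[:2]                        # reconstruction from the latent space

mse = float(np.mean((Xhat - X) ** 2))
print(mse)                               # near machine precision: data is rank 2
```

A nonlinear autoencoder, as used in the paper, generalizes this by replacing the two linear maps with neural networks, capturing curved low-dimensional manifolds rather than flat subspaces.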
arXiv Detail & Related papers (2024-05-16T21:07:09Z)
- PIE-NeRF: Physics-based Interactive Elastodynamics with NeRF [29.6350855891474]
We show that physics-based simulations can be seamlessly integrated with NeRF to generate high-quality elastodynamics of real-world objects.
A quadratic generalized moving least squares (Q-GMLS) method is employed to capture nonlinear dynamics and large deformation on the implicit model.
We adaptively place the least-square kernels according to the NeRF density field to significantly reduce the complexity of the nonlinear simulation.
arXiv Detail & Related papers (2023-11-22T01:58:26Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- NN-EUCLID: deep-learning hyperelasticity without stress data [0.0]
We propose a new approach for unsupervised learning of hyperelastic laws with physics-consistent deep neural networks.
In contrast to supervised learning, which assumes the availability of stress-strain pairs, the approach only uses realistically measurable full-field displacement and global reaction force data.
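The setting above, calibrating a material law from measurable displacements and global forces with no stress data, can be sketched with a toy 1D example: fitting the shear modulus of an incompressible neo-Hookean bar by least squares. The model, parameter values, and noise level are illustrative assumptions, not the paper's method.

```python
import numpy as np

def axial_force(mu, stretch, area=1.0):
    """Axial force of an incompressible neo-Hookean bar: P = mu*A*(l - l^-2)."""
    return mu * area * (stretch - stretch ** -2)

rng = np.random.default_rng(1)
mu_true = 0.8
stretches = np.linspace(1.05, 1.5, 10)   # computed from measured displacements
forces = axial_force(mu_true, stretches) + rng.normal(scale=1e-3, size=10)

# The model is linear in mu, so the least-squares fit is a single projection:
g = stretches - stretches ** -2
mu_fit = float(g @ forces / (g @ g))
print(mu_fit)
```

Only displacements (via the stretches) and global forces enter the fit; no stress measurement is ever needed, which is the key practical point of the unsupervised setting.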
arXiv Detail & Related papers (2022-05-04T13:54:54Z)
- A physics-informed deep neural network for surrogate modeling in classical elasto-plasticity [0.0]
We present a deep neural network architecture that can efficiently approximate classical elasto-plastic relations.
The network is enriched with crucial physics aspects of classical elasto-plasticity, including additive decomposition of strains into elastic and plastic parts.
We show that embedding these physics into the architecture of the neural network facilitates a more efficient training of the network with less training data.
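The additive decomposition of strain into elastic and plastic parts mentioned above underlies the classical return-mapping algorithm, sketched here for 1D small-strain plasticity with linear isotropic hardening. This is textbook material intended to illustrate the decomposition, not the paper's network; the parameter values are illustrative.

```python
import numpy as np

# Material parameters (illustrative): Young's modulus, hardening modulus, yield stress.
E, H, sigma_y = 200.0, 10.0, 1.0

def return_map(eps, eps_p, alpha):
    """Given total strain and plastic state, return updated (stress, eps_p, alpha)."""
    sigma_trial = E * (eps - eps_p)              # elastic predictor: eps_e = eps - eps_p
    f = abs(sigma_trial) - (sigma_y + H * alpha) # yield function
    if f <= 0.0:                                 # still elastic
        return sigma_trial, eps_p, alpha
    dgamma = f / (E + H)                         # plastic corrector
    eps_p += dgamma * np.sign(sigma_trial)       # grow the plastic part of the strain
    alpha += dgamma                              # accumulated plastic strain
    return E * (eps - eps_p), eps_p, alpha

eps_p, alpha = 0.0, 0.0
for eps in np.linspace(0.0, 0.02, 50):           # monotonic tensile loading
    sigma, eps_p, alpha = return_map(eps, eps_p, alpha)
print(sigma, eps_p)
```

After yield, the stress follows the reduced elastoplastic tangent E*H/(E+H) instead of E, and the split eps = eps_e + eps_p is maintained exactly at every step.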
arXiv Detail & Related papers (2022-04-26T05:58:13Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Meta-learning using privileged information for dynamics [66.32254395574994]
We extend the Neural ODE Process model to use additional information within the Learning Using Privileged Information setting.
We validate our extension with experiments showing improved accuracy and calibration on simulated dynamics tasks.
arXiv Detail & Related papers (2021-04-29T12:18:02Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
- Tensor network approaches for learning non-linear dynamical laws [0.0]
We show that various physical constraints can be captured via tensor network based parameterizations for the governing equation.
We provide a physics-informed approach to recovering structured dynamical laws from data, which adaptively balances the need for expressivity and scalability.
arXiv Detail & Related papers (2020-02-27T19:02:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.