Neural Metamaterial Networks for Nonlinear Material Design
- URL: http://arxiv.org/abs/2309.10600v1
- Date: Fri, 15 Sep 2023 13:50:43 GMT
- Title: Neural Metamaterial Networks for Nonlinear Material Design
- Authors: Yue Li, Stelian Coros, Bernhard Thomaszewski
- Abstract summary: We propose Metamaterial Networks -- neural representations that encode the nonlinear mechanics of entire metamaterial families.
We use this approach to automatically design materials with desired stress-strain curves, prescribed directional stiffness and Poisson ratio profiles.
- Score: 29.65492571110993
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nonlinear metamaterials with tailored mechanical properties have applications
in engineering, medicine, robotics, and beyond. While modeling their
macromechanical behavior is challenging in itself, finding structure parameters
that best approximate high-level performance goals is more difficult
still. In this work, we propose Neural Metamaterial Networks (NMN)
-- smooth neural representations that encode the nonlinear mechanics of entire
metamaterial families. Given structure parameters as input, NMN return
continuously differentiable strain energy density functions, thus guaranteeing
conservative forces by construction. Though trained on simulation data, NMN do
not inherit the discontinuities resulting from topological changes in finite
element meshes. They instead provide a smooth map from parameter to performance
space that is fully differentiable and thus well-suited for gradient-based
optimization. On this basis, we formulate inverse material design as a
nonlinear programming problem that leverages neural networks for both objective
functions and constraints. We use this approach to automatically design
materials with desired stress-strain curves, prescribed directional stiffness
and Poisson ratio profiles. We furthermore conduct ablation studies on network
nonlinearities and show the advantages of our approach compared to native-scale
optimization.
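The key mechanism in the abstract -- a smooth scalar energy whose negative gradient yields conservative forces by construction -- can be illustrated with a minimal numpy sketch. The tiny softplus "energy network" and all sizes below are illustrative stand-ins, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny smooth "energy network": strain (and, in the paper, structure
# parameters) in, scalar strain energy density out. Softplus keeps it smooth.
W1 = rng.normal(size=(8, 2)); b1 = rng.normal(size=8)
w2 = rng.normal(size=8)

def softplus(z):
    return np.logaddexp(0.0, z)

def energy(x):
    return w2 @ softplus(W1 @ x + b1)

def force(x, eps=1e-5):
    # Conservative force as the negative gradient of the scalar energy,
    # here via central finite differences (an autodiff framework would
    # give this exactly).
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x); e[i] = eps
        g[i] = (energy(x + e) - energy(x - e)) / (2 * eps)
    return -g

# A force field is conservative iff its Jacobian is symmetric
# (equality of mixed partials of the energy).
x0 = np.array([0.3, -0.2])
eps = 1e-4
J = np.column_stack([
    (force(x0 + np.eye(2)[i] * eps) - force(x0 - np.eye(2)[i] * eps)) / (2 * eps)
    for i in range(2)
])
assert np.allclose(J, J.T, atol=1e-4)
```

Because forces come from differentiating one scalar network output, smoothness and energy conservation hold for every parameter setting, which is what makes gradient-based inverse design well-posed.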
Related papers
- Differentiable Neural-Integrated Meshfree Method for Forward and Inverse Modeling of Finite Strain Hyperelasticity [1.290382979353427]
The present study aims to extend the novel physics-informed machine learning approach, specifically the neural-integrated meshfree (NIM) method, to model finite-strain problems.
Thanks to the inherent differentiable programming capabilities, NIM can circumvent the need for derivation of Newton-Raphson linearization of the variational form.
NIM is applied to identify heterogeneous mechanical properties of hyperelastic materials from strain data, validating its effectiveness in the inverse modeling of nonlinear materials.
arXiv Detail & Related papers (2024-07-15T19:15:18Z) - Hallmarks of Optimization Trajectories in Neural Networks: Directional Exploration and Redundancy [75.15685966213832]
We analyze the rich directional structure of optimization trajectories represented by their pointwise parameters.
We show that training only the scalar batchnorm parameters, beginning partway through training, matches the performance of training the entire network.
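The claim that scalar parameters alone can carry training has a simple numpy analogue: freeze a random feature layer and output weights, and fit only a per-feature scale vector (a stand-in for the batchnorm scalars; this toy is not the paper's experimental setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy analogue of "train only the batchnorm scalars": freeze a random
# feature layer and the output weights, and fit only a per-feature scale gamma.
n, d, h = 200, 5, 64
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d)                 # linear ground-truth target

W = rng.normal(size=(d, h)) / np.sqrt(d)   # frozen "network" weights
Phi = np.tanh(X @ W)                       # fixed hidden features
v = rng.normal(size=h) / np.sqrt(h)        # frozen output weights

def loss(gamma):
    return np.mean(((Phi * gamma) @ v - y) ** 2)

loss_frozen = loss(np.ones(h))             # everything frozen, gamma = 1

# The loss is quadratic in gamma, so the scalar parameters can be fit
# in closed form (gradient descent would reach the same minimizer).
A = Phi * v                                # prediction = A @ gamma
gamma_opt, *_ = np.linalg.lstsq(A, y, rcond=None)
loss_scalars = loss(gamma_opt)

assert loss_scalars < 0.2 * loss_frozen
```

With only `h` scalars trainable, the loss drops far below the frozen baseline, mirroring the paper's observation that a thin slice of the parameters suffices late in training.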
arXiv Detail & Related papers (2024-03-12T07:32:47Z) - Gaussian Process Neural Additive Models [3.7969209746164325]
We propose a new subclass of Neural Additive Models (NAMs) that use a single-layer neural network construction of the Gaussian process via random Fourier features.
GP-NAMs have the advantage of a convex objective function and a number of trainable parameters that grows linearly with feature dimensionality.
We show that GP-NAM achieves comparable or better performance in both classification and regression tasks with a large reduction in the number of parameters.
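The random Fourier feature construction behind GP-NAM can be sketched in a few lines of numpy: a fixed single layer of cosine features whose inner products approximate an RBF kernel. Sizes, bandwidth, and seed below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Random Fourier features: a single fixed layer whose inner products
# approximate the RBF kernel k(x, y) = exp(-|x - y|^2 / 2) (sigma = 1).
d, D = 1, 5000                      # per-feature input dim, number of features
W = rng.normal(size=(D, d))         # frequencies ~ N(0, I) for sigma = 1
b = rng.uniform(0, 2 * np.pi, D)

def z(x):
    # Feature map z(x); only a linear layer on top of z would be trained,
    # which keeps the objective convex.
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = np.array([0.3]), np.array([-0.5])
approx = z(x) @ z(y)
exact = np.exp(-np.sum((x - y) ** 2) / 2)
assert abs(approx - exact) < 0.1
```

Since the feature map is fixed, fitting the output weights is ordinary (regularized) least squares, which is where the convexity and the linear parameter count come from.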
arXiv Detail & Related papers (2024-02-19T20:29:34Z) - SimPINNs: Simulation-Driven Physics-Informed Neural Networks for
Enhanced Performance in Nonlinear Inverse Problems [0.0]
This paper introduces a novel approach to solve inverse problems by leveraging deep learning techniques.
The objective is to infer unknown parameters that govern a physical system based on observed data.
arXiv Detail & Related papers (2023-09-27T06:34:55Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
PINNs suffer training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
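The stability gain from implicit updates can be seen on a one-dimensional stiff quadratic, where the implicit step has a closed form. (For PINNs, the implicit equation must be solved numerically at each step; this is only a sketch of the principle.)

```python
import numpy as np

# Implicit gradient step on f(w) = 0.5 * lam * w^2:
#   explicit: w+ = w - lr * lam * w         (diverges when lr * lam > 2)
#   implicit: w+ = w - lr * lam * w+  =>  w+ = w / (1 + lr * lam)  (always stable)
lam, lr = 100.0, 0.1                # stiff curvature, aggressive step size

w_exp = w_imp = 1.0
for _ in range(50):
    w_exp = w_exp - lr * lam * w_exp    # explicit update, factor (1 - 10) = -9
    w_imp = w_imp / (1.0 + lr * lam)    # implicit update, factor 1/11

assert abs(w_exp) > 1e10            # explicit GD blows up
assert abs(w_imp) < 1e-10           # implicit GD converges to the minimizer 0
```

The implicit update evaluates the gradient at the new iterate, which damps stiff directions unconditionally -- the same mechanism the paper exploits for ill-conditioned PINN losses.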
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Optimization-Induced Graph Implicit Nonlinear Diffusion [64.39772634635273]
We propose a new graph convolution variant, called Graph Implicit Nonlinear Diffusion (GIND).
GIND implicitly has access to infinite hops of neighbors while adaptively aggregating features with nonlinear diffusion to prevent over-smoothing.
We show that the learned representation can be formalized as the minimizer of an explicit convex optimization objective.
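The link between implicit diffusion and a convex objective can be sketched in numpy on a toy graph (this is not the GIND architecture, just the underlying principle): iterated diffusion converges to an equilibrium that coincides with the minimizer of an explicit convex objective.

```python
import numpy as np

# Toy implicit diffusion on a 3-node path graph: the equilibrium of the
# diffusion iteration minimizes a convex objective
#   g(z) = 0.5 * |z - x|^2 + 0.5 * lam * z' L z
# whose closed-form minimizer is (I + lam * L)^{-1} x.
L = np.array([[ 1., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])     # graph Laplacian of a path
x = np.array([1.0, 0.0, -2.0])      # input node features
lam, alpha = 0.5, 0.2

z = np.zeros(3)
for _ in range(500):
    grad = (z - x) + lam * (L @ z)  # gradient of the convex objective
    z = z - alpha * grad            # diffusion step toward equilibrium

z_star = np.linalg.solve(np.eye(3) + lam * L, x)
assert np.allclose(z, z_star, atol=1e-6)
```

Running diffusion to equilibrium gives access to arbitrarily many hops of neighbors without stacking layers, while the variational view explains why the fixed point is well defined.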
arXiv Detail & Related papers (2022-06-29T06:26:42Z) - Neural Operator with Regularity Structure for Modeling Dynamics Driven
by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic $\Phi^4_1$ model and the 2D Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z) - Learning the nonlinear dynamics of soft mechanical metamaterials with
graph networks [3.609538870261841]
We propose a machine learning approach to study the dynamics of soft mechanical metamaterials.
The proposed approach can significantly reduce the computational cost when compared to direct numerical simulation.
arXiv Detail & Related papers (2022-02-24T00:20:28Z) - Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to the global approximation, physics informed neural networks have difficulties in displaying localized effects and strong non-linear solutions by optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z) - Provably Efficient Neural Estimation of Structural Equation Model: An
Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z) - Topology optimization of 2D structures with nonlinearities using deep
learning [0.0]
Cloud computing has made it possible to search for optimal nonlinear structures.
We develop convolutional neural network models to predict optimized designs.
The developed models are capable of accurately predicting the optimized designs without requiring an iterative scheme.
arXiv Detail & Related papers (2020-01-31T12:36:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.