Data-Free Learning of Reduced-Order Kinematics
- URL: http://arxiv.org/abs/2305.03846v1
- Date: Fri, 5 May 2023 20:53:36 GMT
- Title: Data-Free Learning of Reduced-Order Kinematics
- Authors: Nicholas Sharp, Cristian Romero, Alec Jacobson, Etienne Vouga, Paul G.
Kry, David I.W. Levin, Justin Solomon
- Abstract summary: We produce a low-dimensional map whose image parameterizes a diverse yet low-energy submanifold of configurations.
We represent subspaces as neural networks that map a low-dimensional latent vector to the full configuration space.
This formulation is effective across a very general range of physical systems.
- Score: 54.85157881323157
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physical systems ranging from elastic bodies to kinematic linkages are
defined on high-dimensional configuration spaces, yet their typical low-energy
configurations are concentrated on much lower-dimensional subspaces. This work
addresses the challenge of identifying such subspaces automatically: given as
input an energy function for a high-dimensional system, we produce a
low-dimensional map whose image parameterizes a diverse yet low-energy
submanifold of configurations. The only additional input needed is a single
seed configuration for the system to initialize our procedure; no dataset of
trajectories is required. We represent subspaces as neural networks that map a
low-dimensional latent vector to the full configuration space, and propose a
training scheme to fit network parameters to any system of interest. This
formulation is effective across a very general range of physical systems; our
experiments demonstrate not only nonlinear and very low-dimensional elastic
body and cloth subspaces, but also more general systems like colliding rigid
bodies and linkages. We briefly explore applications built on this formulation,
including manipulation, latent interpolation, and sampling.
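The training idea in the abstract — fit a map from a low-dimensional latent vector to the full configuration space so that mapped configurations stay low-energy while remaining diverse — can be illustrated with a toy sketch. Everything below is hypothetical: the paper uses a neural network map and its own regularization scheme, whereas this sketch uses a linear map, a toy quadratic spring energy, and a crude orthonormality term standing in for a diversity/isometry regularizer, so that the gradients are analytic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a high-dimensional physical system: a quadratic
# energy E(q) = 0.5 q^T K q with per-coordinate stiffnesses K.
n, d = 20, 2                               # full / latent dimensions
K = np.diag(np.linspace(0.1, 10.0, n))     # hypothetical toy stiffness matrix

def energy(q):
    return 0.5 * q @ K @ q

# Subspace map f(z) = q0 + A z. The paper fits a neural network here;
# a linear map keeps this sketch analytic.
q0 = np.zeros(n)                           # single seed configuration
A = 0.1 * rng.normal(size=(n, d))

lr = 1e-2
for step in range(2000):
    Z = rng.normal(size=(64, d))           # sample latent vectors
    Q = q0 + Z @ A.T                       # mapped configurations
    # Gradient of the mean energy w.r.t. A: E[ K q z^T ]
    grad_E = (Q @ K).T @ Z / len(Z)
    # Crude diversity term pushing A toward orthonormal columns,
    # a stand-in for the paper's actual regularizer.
    grad_div = A @ (A.T @ A - np.eye(d))
    A -= lr * (grad_E + 0.5 * grad_div)

# Configurations sampled from the learned subspace stay low-energy:
# the columns of A concentrate on the low-stiffness directions.
mean_E = np.mean([energy(q0 + A @ z) for z in rng.normal(size=(256, d))])
```

The two gradient terms pull in opposite directions: the energy term alone would collapse the map to the seed configuration, while the diversity term alone would ignore the physics; their balance selects a low-energy but non-degenerate submanifold, which is the trade-off the abstract describes.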
Related papers
- Self Supervised Networks for Learning Latent Space Representations of Human Body Scans and Motions [6.165163123577484]
This paper introduces self-supervised neural network models to tackle several fundamental problems in 3D human body analysis and processing.
First, we propose VariShaPE, a novel architecture for the retrieval of latent space representations of body shapes and poses.
Second, we complement the estimation of latent codes with MoGeN, a framework that learns the geometry on the latent space itself.
arXiv Detail & Related papers (2024-11-05T19:59:40Z)
- FLoRA: Low-Rank Core Space for N-dimension [78.39310274926535]
Adapting pre-trained foundation models for various downstream tasks has been prevalent in artificial intelligence.
To mitigate this, several fine-tuning techniques have been developed to update the pre-trained model weights in a more resource-efficient manner.
This paper introduces a generalized parameter-efficient fine-tuning framework, FLoRA, designed for various dimensional parameter space.
arXiv Detail & Related papers (2024-05-23T16:04:42Z)
- Flatten Anything: Unsupervised Neural Surface Parameterization [76.4422287292541]
We introduce the Flatten Anything Model (FAM), an unsupervised neural architecture to achieve global free-boundary surface parameterization.
Compared with previous methods, our FAM directly operates on discrete surface points without utilizing connectivity information.
Our FAM is fully-automated without the need for pre-cutting and can deal with highly-complex topologies.
arXiv Detail & Related papers (2024-05-23T14:39:52Z)
- Autoencoders for discovering manifold dimension and coordinates in data from complex dynamical systems [0.0]
The autoencoder framework combines implicit regularization with internal linear layers and $L_2$ regularization (weight decay).
We show that this framework can be naturally extended for applications of state-space modeling and forecasting.
arXiv Detail & Related papers (2023-05-01T21:14:47Z)
- Neural Motion Fields: Encoding Grasp Trajectories as Implicit Value Functions [65.84090965167535]
We present Neural Motion Fields, a novel object representation which encodes both object point clouds and the relative task trajectories as an implicit value function parameterized by a neural network.
This object-centric representation models a continuous distribution over the SE(3) space and allows us to perform grasping reactively by leveraging sampling-based MPC to optimize this value function.
arXiv Detail & Related papers (2022-06-29T18:47:05Z)
- Neural-Network Quantum States for Periodic Systems in Continuous Space [66.03977113919439]
We introduce a family of neural quantum states for the simulation of strongly interacting systems in the presence of periodicity.
For one-dimensional systems we find very precise estimations of the ground-state energies and the radial distribution functions of the particles.
In two dimensions we obtain good estimations of the ground-state energies, comparable to results obtained from more conventional methods.
arXiv Detail & Related papers (2021-12-22T15:27:30Z)
- Nested Hyperbolic Spaces for Dimensionality Reduction and Hyperbolic NN Design [8.250374560598493]
Hyperbolic neural networks have been popular in the recent past due to their ability to represent hierarchical data sets effectively and efficiently.
The challenge in developing these networks lies in the nonlinearity of the embedding space, namely the hyperbolic space.
We present a novel fully hyperbolic neural network which uses the concept of projections (embeddings) followed by an intrinsic aggregation and a nonlinearity all within the hyperbolic space.
arXiv Detail & Related papers (2021-12-03T03:20:27Z)
- Adaptive Machine Learning for Time-Varying Systems: Low Dimensional Latent Space Tuning [91.3755431537592]
We present a recently developed method of adaptive machine learning for time-varying systems.
Our approach is to map very high-dimensional (N>100k) inputs into a low-dimensional (N≈2) latent space at the output of the encoder section of an encoder-decoder CNN.
This method allows us to learn correlations within the input data and to track their evolution in real time based on feedback, without interrupting operation.
arXiv Detail & Related papers (2021-07-13T16:05:28Z)
- Physics-aware registration based auto-encoder for convection dominated PDEs [6.85316573653194]
We propose a physics-aware auto-encoder to specifically reduce the dimensionality of solutions arising from convection-dominated nonlinear physical systems.
We demonstrate the efficacy and interpretability of our approach to separate convection/advection from diffusion/scaling on various manufactured and physical systems.
arXiv Detail & Related papers (2020-06-28T16:58:21Z)
- A Tailored Convolutional Neural Network for Nonlinear Manifold Learning of Computational Physics Data using Unstructured Spatial Discretizations [0.0]
We propose a nonlinear manifold learning technique based on deep convolutional autoencoders.
The technique is appropriate for model order reduction of physical systems in complex geometries.
arXiv Detail & Related papers (2020-06-11T02:19:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences of its use.