Geometric Algebra Attention Networks for Small Point Clouds
- URL: http://arxiv.org/abs/2110.02393v1
- Date: Tue, 5 Oct 2021 22:52:12 GMT
- Title: Geometric Algebra Attention Networks for Small Point Clouds
- Authors: Matthew Spellings
- Abstract summary: Problems in the physical sciences deal with relatively small sets of points in two- or three-dimensional space.
We present rotation- and permutation-equivariant architectures for deep learning on these small point clouds.
We demonstrate the usefulness of these architectures by training models to solve sample problems relevant to physics, chemistry, and biology.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Much of the success of deep learning is drawn from building architectures
that properly respect underlying symmetry and structure in the data on which
they operate - a set of considerations that have been united under the banner
of geometric deep learning. Often problems in the physical sciences deal with
relatively small sets of points in two- or three-dimensional space wherein
translation, rotation, and permutation equivariance are important or even vital
for models to be useful in practice. In this work, we present rotation- and
permutation-equivariant architectures for deep learning on these small point
clouds, composed of a set of products of terms from the geometric algebra and
reductions over those products using an attention mechanism. The geometric
algebra provides valuable mathematical structure by which to combine vector,
scalar, and other types of geometric inputs in a systematic way to account for
rotation invariance or covariance, while attention yields a powerful way to
impose permutation equivariance. We demonstrate the usefulness of these
architectures by training models to solve sample problems relevant to physics,
chemistry, and biology.
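To make the architecture described in the abstract more concrete, below is a minimal NumPy sketch of one such layer, written under loose assumptions; it is not the authors' implementation, and all function names, weight shapes, and the single-matrix score/value functions are illustrative only. Rotation-invariant scalars are built from products of point pairs (the symmetric dot-product part and the magnitude of the antisymmetric wedge/bivector part of their geometric product), and an attention-weighted sum over partner points gives a permutation-equivariant reduction.

```python
import numpy as np

def pairwise_invariants(points):
    """Rotation-invariant scalars from products of 3D point pairs: the
    symmetric (dot-product) part and the magnitude of the antisymmetric
    (wedge/bivector) part of the geometric product, plus squared norms."""
    n = len(points)
    feats = np.zeros((n, n, 4))
    for i in range(n):
        for j in range(n):
            ri, rj = points[i], points[j]
            feats[i, j] = (ri @ rj,
                           np.linalg.norm(np.cross(ri, rj)),
                           ri @ ri,
                           rj @ rj)
    return feats

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_layer(points, values_in, w_score, w_value):
    """One rotation-invariant, permutation-equivariant layer: score and
    value functions of the pairwise invariants, reduced over partners j
    by attention to give one output feature vector per point i."""
    n = len(points)
    inv = pairwise_invariants(points)                      # (n, n, 4)
    partner = np.broadcast_to(values_in[None], (n, n, values_in.shape[1]))
    pair = np.concatenate([inv, partner], axis=-1)         # (n, n, 4 + d_in)
    scores = (pair @ w_score)[..., 0]                      # (n, n)
    weights = softmax(scores, axis=1)                      # attention over partners j
    values = np.tanh(pair @ w_value)                       # (n, n, d_out)
    return np.einsum('ij,ijd->id', weights, values)        # (n, d_out)

# Toy usage with hypothetical sizes: 5 points in 3D, 8 features per point.
rng = np.random.default_rng(0)
points = rng.normal(size=(5, 3))
values = rng.normal(size=(5, 8))
w_score = rng.normal(size=(4 + 8, 1))
w_value = rng.normal(size=(4 + 8, 8))
out = attention_layer(points, values, w_score, w_value)   # shape (5, 8)
```

Because the layer only sees dot products and cross-product magnitudes, its output is unchanged under rotations and permutes with the input points; the paper's actual layers additionally work over tuples of points, use learned networks rather than single weight matrices, and can produce covariant (vector-valued) outputs as well.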
Related papers
- Current Symmetry Group Equivariant Convolution Frameworks for Representation Learning [5.802794302956837]
Euclidean deep learning is often inadequate for addressing real-world signals where the representation space is irregular and curved with complex topologies.
We focus on the importance of symmetry group equivariant deep learning models and their realization of convolution-like operations on graphs, 3D shapes, and non-Euclidean spaces.
arXiv Detail & Related papers (2024-09-11T15:07:18Z) - A hybrid numerical methodology coupling Reduced Order Modeling and Graph Neural Networks for non-parametric geometries: applications to structural dynamics problems [0.0]
This work introduces a new approach for accelerating the numerical analysis of time-domain partial differential equations (PDEs) governing complex physical systems.
The methodology is based on a combination of a classical reduced-order modeling (ROM) framework and recently introduced Graph Neural Networks (GNNs) for non-parametric geometries.
arXiv Detail & Related papers (2024-06-03T08:51:25Z) - A Survey of Geometric Graph Neural Networks: Data Structures, Models and
Applications [67.33002207179923]
This paper presents a survey of data structures, models, and applications related to geometric GNNs.
We provide a unified view of existing models from the geometric message passing perspective.
We also summarize the applications as well as the related datasets to facilitate later research for methodology development and experimental evaluation.
arXiv Detail & Related papers (2024-03-01T12:13:04Z) - A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z) - Unraveling the Single Tangent Space Fallacy: An Analysis and Clarification for Applying Riemannian Geometry in Robot Learning [6.253089330116833]
Handling geometric constraints effectively requires the incorporation of tools from differential geometry into the formulation of machine learning methods.
Their recent adoption in robot learning, however, has been largely characterized by a mathematically flawed simplification.
This paper provides a theoretical elucidation of various misconceptions surrounding this approach and offers experimental evidence of its shortcomings.
arXiv Detail & Related papers (2023-10-11T21:16:01Z) - Symmetry-Informed Geometric Representation for Molecules, Proteins, and
Crystalline Materials [66.14337835284628]
We propose a platform, coined Geom3D, which enables benchmarking the effectiveness of geometric strategies.
Geom3D contains 16 advanced symmetry-informed geometric representation models and 14 geometric pretraining methods over 46 diverse datasets.
arXiv Detail & Related papers (2023-06-15T05:37:25Z) - Algebraic Machine Learning with an Application to Chemistry [0.0]
We develop a machine learning pipeline that captures fine-grained geometric information without relying on smoothness assumptions.
In particular, we propose a method for numerically detecting points lying near the singular locus of the underlying variety.
arXiv Detail & Related papers (2022-05-11T22:41:19Z) - Symmetry Group Equivariant Architectures for Physics [52.784926970374556]
In the domain of machine learning, an awareness of symmetries has driven impressive performance breakthroughs.
We argue that both the physics community and the broader machine learning community have much to learn from one another.
arXiv Detail & Related papers (2022-03-11T18:27:04Z) - Frame Averaging for Equivariant Shape Space Learning [85.42901997467754]
A natural way to incorporate symmetries in shape space learning is to ask that the mapping to the shape space (encoder) and mapping from the shape space (decoder) are equivariant to the relevant symmetries.
We present a framework for incorporating equivariance in encoders and decoders by introducing two contributions.
arXiv Detail & Related papers (2021-12-03T06:41:19Z) - Hermitian Symmetric Spaces for Graph Embeddings [0.0]
We learn continuous representations of graphs in spaces of symmetric matrices over C.
These spaces offer a rich geometry that simultaneously admits hyperbolic and Euclidean subspaces.
The proposed models are able to automatically adapt to very dissimilar arrangements without any a priori estimates of graph features.
arXiv Detail & Related papers (2021-05-11T18:14:52Z) - Quadric hypersurface intersection for manifold learning in feature space [52.83976795260532]
We propose a manifold learning technique suitable for moderately high-dimensional and large datasets.
The manifold is learned from the training data in the form of an intersection of quadric hypersurfaces.
At test time, this manifold can be used to introduce an outlier score for arbitrary new points.
arXiv Detail & Related papers (2021-02-11T18:52:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.