Banana: Banach Fixed-Point Network for Pointcloud Segmentation with
Inter-Part Equivariance
- URL: http://arxiv.org/abs/2305.16314v2
- Date: Fri, 26 May 2023 14:28:26 GMT
- Title: Banana: Banach Fixed-Point Network for Pointcloud Segmentation with
Inter-Part Equivariance
- Authors: Congyue Deng, Jiahui Lei, Bokui Shen, Kostas Daniilidis, Leonidas
Guibas
- Abstract summary: In this paper, we present Banana, a Banach fixed-point network for equivariant segmentation with inter-part equivariance by construction.
Our key insight is to iteratively solve a fixed-point problem, where point-part assignment labels and per-part SE(3)-equivariance co-evolve simultaneously.
Our formulation naturally provides a strict definition of inter-part equivariance that generalizes to unseen inter-part configurations.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Equivariance has gained strong interest as a desirable network property that
inherently ensures robust generalization. However, when dealing with complex
systems such as articulated objects or multi-object scenes, effectively
capturing inter-part transformations poses a challenge, as it becomes entangled
with the overall structure and local transformations. The interdependence of
part assignment and per-part group action necessitates a novel equivariance
formulation that allows for their co-evolution. In this paper, we present
Banana, a Banach fixed-point network for equivariant segmentation with
inter-part equivariance by construction. Our key insight is to iteratively
solve a fixed-point problem, where point-part assignment labels and per-part
SE(3)-equivariance co-evolve simultaneously. We provide theoretical derivations
of both per-step equivariance and global convergence, which induces an
equivariant final convergent state. Our formulation naturally provides a strict
definition of inter-part equivariance that generalizes to unseen inter-part
configurations. Through experiments conducted on both articulated objects and
multi-object scans, we demonstrate the efficacy of our approach in achieving
strong generalization under inter-part transformations, even when confronted
with substantial changes in pointcloud geometry and topology.
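The abstract describes an alternating scheme: hold the per-part features fixed, update the point-part assignment, and repeat until the assignment stops changing. Below is a minimal sketch of that control flow, assuming a soft assignment matrix and a per-part SE(3)-equivariant update network; every name here (`banach_fixed_point_segmentation`, `toy_net`, the parameters) is a hypothetical stand-in, not the authors' code.

```python
import numpy as np

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def banach_fixed_point_segmentation(points, num_parts, net, max_iters=50, tol=1e-5):
    """Co-evolution loop: labels -> per-part equivariant features -> new
    labels, iterated to a fixed point. In the Banach fixed-point view, a
    contractive update guarantees convergence to a unique state, and
    per-step equivariance then makes that converged state equivariant."""
    n = points.shape[0]
    labels = np.full((n, num_parts), 1.0 / num_parts)  # uniform soft start
    for _ in range(max_iters):
        logits = net(points, labels)   # the per-part SE(3)-equivariant step
        new_labels = softmax(logits)
        converged = np.abs(new_labels - labels).max() < tol
        labels = new_labels
        if converged:                  # fixed point reached
            break
    return labels

# Toy stand-in for the equivariant network: score each point against two
# fixed "part centers" (purely illustrative; it even ignores the labels).
def toy_net(points, labels):
    centers = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
    return -np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=-1)

assignment = banach_fixed_point_segmentation(
    np.random.default_rng(0).normal(size=(100, 3)), 2, toy_net)
```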
Related papers
- Approximately Piecewise E(3) Equivariant Point Networks
We introduce APEN: a framework for constructing approximate piecewise-$E(3)$ equivariant point networks.
Our primary insight is that functions equivariant with respect to a finer partition remain equivariant with respect to the true partition (a formal statement is sketched after this entry).
We demonstrate the effectiveness of APEN using two data types exemplifying part-based symmetry.
arXiv Detail & Related papers (2024-02-13T15:34:39Z)
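The finer-partition insight above admits a one-line formalization. The statement below is our reading of the APEN summary, with $\mathcal{G}_P$ denoting the piecewise-$E(3)$ transformations that respect partition $P$; it is not text from the paper.

```latex
% If P' refines P, every P-piecewise transformation is also P'-piecewise
% (it is constant on each finer cell), so G_P is contained in G_{P'};
% equivariance over the larger set then implies it over the smaller one:
\[
\mathcal{G}_{P} \subseteq \mathcal{G}_{P'}
\;\Longrightarrow\;
\Bigl( \forall g \in \mathcal{G}_{P'}:\; f(g \cdot x) = g \cdot f(x) \Bigr)
\implies
\Bigl( \forall g \in \mathcal{G}_{P}:\; f(g \cdot x) = g \cdot f(x) \Bigr).
\]
```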
- Almost Equivariance via Lie Algebra Convolutions
We provide a definition of almost equivariance and give a practical method for encoding it in models.
Specifically, we define Lie algebra convolutions and demonstrate that they offer several benefits over Lie group convolutions (one candidate form is sketched after this entry).
We prove two existence theorems, one showing the existence of almost isometries within bounded distance of isometries of a manifold, and another showing the converse for Hilbert spaces.
arXiv Detail & Related papers (2023-10-19T21:31:11Z)
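For contrast with the Lie group convolutions mentioned above: a group convolution integrates over the group $G$ itself, whereas a Lie algebra convolution can integrate over the algebra $\mathfrak{g}$ (a flat vector space) and reach the group through the exponential map. The second formula is a plausible reconstruction from the summary, not necessarily the paper's exact definition.

```latex
% Lie group convolution: integral over the group manifold.
\[
(f \star \psi)(x) = \int_{G} \psi(g)\, f(g^{-1} \cdot x)\, dg
\]
% Lie algebra convolution (assumed form): integrate over the algebra,
% acting through exp, so no coordinates on the group manifold are needed.
\[
(f \star \psi)(x) = \int_{\mathfrak{g}} \psi(A)\, f\bigl(\exp(-A) \cdot x\bigr)\, dA
\]
```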
- Deep Neural Networks with Efficient Guaranteed Invariances
We address the problem of improving the performance, and in particular the sample complexity, of deep neural networks.
Group-equivariant convolutions are a popular approach to obtain equivariant representations.
We propose a multi-stream architecture, where each stream is invariant to a different transformation (a minimal sketch follows this entry).
arXiv Detail & Related papers (2023-03-02T20:44:45Z)
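A minimal sketch of a multi-stream design in this spirit: each stream is made invariant to one transformation group by averaging a feature over that group's orbit, and the streams are concatenated. The orbit-averaging trick and all names here are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def rotations_90(x):
    # Orbit of a 2-D array under 90-degree rotations (the group C4).
    return [np.rot90(x, k) for k in range(4)]

def flips(x):
    # Orbit under horizontal flipping (the group Z2).
    return [x, np.fliplr(x)]

def invariant_stream(x, orbit, feature_fn):
    # Averaging a feature over a group orbit makes it invariant to that
    # group: transforming x only permutes the orbit, so the mean is fixed.
    return np.mean([feature_fn(t) for t in orbit(x)], axis=0)

def multi_stream(x, feature_fn):
    # Each stream is invariant to a different transformation; the
    # concatenation retains every per-stream guarantee.
    return np.concatenate(
        [invariant_stream(x, g, feature_fn) for g in (rotations_90, flips)])

# Example with a trivial "network" that sums each column of the input.
features = multi_stream(np.arange(16.0).reshape(4, 4), lambda x: x.sum(axis=0))
```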
- A PAC-Bayesian Generalization Bound for Equivariant Networks
We derive norm-based PAC-Bayesian generalization bounds for equivariant networks.
The bound characterizes the impact of group size, and of the multiplicity and degree of irreducible representations, on the generalization error.
In general, the bound indicates that using a larger group size in the model improves generalization, a finding substantiated by extensive numerical experiments.
arXiv Detail & Related papers (2022-10-24T12:07:03Z)
- Equivariant Disentangled Transformation for Domain Generalization under Combination Shift
Under combination shift, some combinations of domains and labels are not observed during training but appear in the test environment.
We provide a unique formulation of the combination shift problem based on the concepts of homomorphism, equivariance, and a refined definition of disentanglement.
arXiv Detail & Related papers (2022-08-03T12:31:31Z)
- Equivariance Discovery by Learned Parameter-Sharing
We study how to discover interpretable equivariances from data.
Specifically, we formulate this discovery process as an optimization problem over a model's parameter-sharing schemes (a minimal sketch follows this entry).
We also theoretically analyze the method for Gaussian data and provide a bound on the mean squared gap between the studied discovery scheme and the oracle scheme.
arXiv Detail & Related papers (2022-04-07T17:59:19Z)
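One concrete reading of "an optimization problem over parameter-sharing schemes": reparameterize the weights as a learnable soft assignment of each weight to a small bank of free parameters, and fit the assignment from data. The sketch below follows that reading; the variable names and the softmax relaxation are our assumptions, not the paper's formulation.

```python
import numpy as np

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
n_weights, n_free = 12, 3                      # 12 weights tied to 3 free parameters
theta = rng.normal(size=n_free)                # free parameters
logits = rng.normal(size=(n_weights, n_free))  # learnable sharing scheme

# Soft assignment of each weight to a free parameter. As the rows of S
# harden, this becomes a hard tying pattern -- the form in which an
# equivariance (e.g., convolutional weight sharing) can be expressed.
S = softmax(logits)
W = S @ theta  # the weights the model actually uses

# Training would optimize theta and logits jointly on the task loss; a
# regularizer on S can push it toward hard, interpretable sharing.
```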
- Capacity of Group-invariant Linear Readouts from Equivariant Representations: How Many Objects can be Linearly Classified Under All Possible Views?
We find that the fraction of separable dichotomies is determined by the dimension of the space that is fixed by the group action (a small numerical illustration of this quantity follows this entry).
We show how this relation extends to operations such as convolutions, element-wise nonlinearities, and global and local pooling.
arXiv Detail & Related papers (2021-10-14T15:46:53Z)
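The quantity in this result, the dimension of the subspace fixed by the group action, is easy to compute for a finite group: averaging the representation matrices gives the projector onto the fixed subspace, and the projector's trace is the dimension. A small self-contained check (our example, not code from the paper):

```python
import numpy as np

def fixed_subspace_dim(group_mats):
    # P = (1/|G|) * sum_g rho(g) projects onto the fixed subspace of the
    # action, and trace(P) equals that subspace's dimension.
    P = np.mean(group_mats, axis=0)
    return np.trace(P)

# The cyclic group C_n acting on R^n by circular shifts: only constant
# vectors are fixed, so the fixed subspace is one-dimensional.
n = 6
shift = np.roll(np.eye(n), 1, axis=0)
mats = [np.linalg.matrix_power(shift, k) for k in range(n)]
print(fixed_subspace_dim(mats))  # ~1.0
```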
- Coordinate Independent Convolutional Networks -- Isometry and Gauge Equivariant Convolutions on Riemannian Manifolds
A major complication in comparison to flat spaces is that it is unclear in which alignment a convolution kernel should be applied on a manifold.
We argue that the particular choice of coordinatization should not affect a network's inference -- it should be coordinate independent.
A simultaneous demand for coordinate independence and weight sharing is shown to result in a requirement on the network to be equivariant.
arXiv Detail & Related papers (2021-06-10T19:54:19Z)
- LieTransformer: Equivariant self-attention for Lie Groups
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups (a simplified equivariant-attention sketch follows this entry).
arXiv Detail & Related papers (2020-12-20T11:02:49Z)
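Not the LieSelfAttention layer itself, but a minimal illustration of why such layers can exist: if attention logits depend only on group invariants (here, pairwise distances) and the values transform like the input (here, relative positions), the whole layer commutes with the group action. This stand-in handles only 3-D rotations, far short of arbitrary Lie groups; all names are ours.

```python
import numpy as np

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def rotation_equivariant_attention(x):
    # x: (N, 3) points. Logits use only pairwise distances (rotation
    # invariant); values are relative positions (rotation equivariant).
    diffs = x[:, None, :] - x[None, :, :]         # (N, N, 3)
    dists = np.linalg.norm(diffs, axis=-1)        # (N, N), invariant
    attn = softmax(-dists, axis=-1)               # unchanged by rotations
    return (attn[..., None] * diffs).sum(axis=1)  # rotates with the input

# Equivariance check with a random orthogonal matrix R (from QR).
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
assert np.allclose(rotation_equivariant_attention(x @ R.T),
                   rotation_equivariant_attention(x) @ R.T)
```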
- Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data
We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group (a discrete-group analogue is sketched after this entry).
We apply the same model architecture to images, ball-and-stick molecular data, and Hamiltonian dynamical systems.
arXiv Detail & Related papers (2020-02-25T17:40:38Z)
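The continuous-group layer of this last paper is built on the defining equation of group convolution, $(k \star f)(g) = \sum_{h} k(g^{-1}h)\, f(h)$ (an integral in the continuous case, approximated by sampling). The discrete analogue below, for the rotation subgroup C4, shows the equation and its equivariance in a few lines; it is intuition for the construction, not the paper's layer.

```python
import numpy as np

# Group convolution over the cyclic group C4 (rotations by multiples of
# 90 degrees), indexing elements by k in {0,1,2,3}; composition is
# addition mod 4, so g^{-1} h = (h - g) mod 4.
def group_conv_c4(f, kernel):
    return np.array([sum(kernel[(h - g) % 4] * f[h] for h in range(4))
                     for g in range(4)])

f = np.array([1.0, 2.0, 3.0, 4.0])  # signal on the group
k = np.array([1.0, 0.5, 0.0, 0.0])  # kernel on the group

# Equivariance: acting on the input by a group element shifts the output
# by the same element.
assert np.allclose(group_conv_c4(np.roll(f, 1), k),
                   np.roll(group_conv_c4(f, k), 1))
```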
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.