Efficient Graph Field Integrators Meet Point Clouds
- URL: http://arxiv.org/abs/2302.00942v6
- Date: Wed, 4 Oct 2023 19:17:43 GMT
- Title: Efficient Graph Field Integrators Meet Point Clouds
- Authors: Krzysztof Choromanski, Arijit Sehanobish, Han Lin, Yunfan Zhao, Eli
Berger, Tetiana Parshakova, Alvin Pan, David Watkins, Tianyi Zhang, Valerii
Likhosherstov, Somnath Basu Roy Chowdhury, Avinava Dubey, Deepali Jain, Tamas
Sarlos, Snigdha Chaturvedi, Adrian Weller
- Abstract summary: We present two new classes of algorithms for efficient field integration on graphs encoding point clouds.
The first class, SeparatorFactorization(SF), leverages the bounded genus of point cloud mesh graphs, while the second class, RFDiffusion(RFD), uses popular epsilon-nearest-neighbor graph representations for point clouds.
- Score: 59.27295475120132
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present two new classes of algorithms for efficient field integration on
graphs encoding point clouds. The first class, SeparatorFactorization(SF),
leverages the bounded genus of point cloud mesh graphs, while the second class,
RFDiffusion(RFD), uses popular epsilon-nearest-neighbor graph representations
for point clouds. Both can be viewed as providing the functionality of Fast
Multipole Methods (FMMs), which have had a tremendous impact on efficient
integration, but for non-Euclidean spaces. We focus on geometries induced by
distributions of walk lengths between points (e.g., shortest-path distance). We
provide an extensive theoretical analysis of our algorithms, obtaining new
results in structural graph theory as a byproduct. We also perform exhaustive
empirical evaluation, including on-surface interpolation for rigid and
deformable objects (particularly for mesh-dynamics modeling), Wasserstein
distance computations for point clouds, and the Gromov-Wasserstein variant.
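To make the setting concrete, here is a minimal baseline sketch of the brute-force graph field integration that SF and RFD are designed to accelerate: for every node of a graph built over the point cloud, it accumulates a kernel-weighted sum of a field over all nodes, with the kernel applied to shortest-path (walk-length) distances. This is not the paper's algorithms; the epsilon-neighborhood construction, Gaussian kernel, and all names and parameters are illustrative assumptions.

```python
# Minimal baseline sketch (not the paper's SF or RFD algorithms): brute-force
# field integration on an epsilon-nearest-neighbor graph over a point cloud,
# using a Gaussian kernel of shortest-path distances. Helper name, epsilon,
# and kernel choice are illustrative assumptions.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path

def brute_force_field_integration(points, field, eps=0.3, sigma=1.0):
    """For every node i, compute sum_j k(d(i, j)) * field[j], with d the graph geodesic."""
    pairwise = cdist(points, points)                          # Euclidean distances
    adjacency = np.where(pairwise <= eps, pairwise, 0.0)      # keep edges within eps
    d = shortest_path(csr_matrix(adjacency), directed=False)  # all-pairs shortest paths
    kernel = np.exp(-(d / sigma) ** 2)                        # kernel of walk-length distance
    kernel[~np.isfinite(d)] = 0.0                             # disconnected pairs contribute 0
    return kernel @ field                                     # integrate the field at every node

# Example: integrate a random scalar field over a random 2-D point cloud.
rng = np.random.default_rng(0)
pts, f = rng.random((200, 2)), rng.standard_normal(200)
print(brute_force_field_integration(pts, f).shape)            # (200,)
```

The quadratic kernel matrix and the all-pairs shortest-path computation in this baseline are exactly the costs that FMM-style integrators such as SF and RFD aim to avoid.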
Related papers
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node
Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
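As a rough illustration of kernelized all-pair propagation, the sketch below uses Performer-style positive random features to approximate softmax-weighted aggregation over all node pairs in time linear in the number of nodes. It is not NodeFormer's exact operator (in particular, the Gumbel-Softmax sampling is omitted), and every function name and hyperparameter is a hypothetical stand-in.

```python
# Hypothetical sketch: positive-random-feature approximation of softmax-weighted
# all-pair message passing, linear in the number of nodes N. Not NodeFormer's
# exact kernelized Gumbel-Softmax operator; the Gumbel perturbation is omitted.
import numpy as np

def random_feature_propagation(X, V, num_features=64, seed=0):
    """X: (N, d) node features used as queries/keys; V: (N, c) node signals to propagate."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, num_features))
    Xs = X / d ** 0.25                          # softmax temperature scaling
    # phi(x) = exp(x.W - ||x||^2 / 2) / sqrt(m) gives E[phi(q).phi(k)] = exp(q.k)
    F = np.exp(Xs @ W - 0.5 * np.sum(Xs**2, axis=1, keepdims=True)) / np.sqrt(num_features)
    numerator = F @ (F.T @ V)                   # messages aggregated from all N nodes
    denominator = F @ F.sum(axis=0)             # softmax normalizer for each node
    return numerator / denominator[:, None]

# Example: propagate 16-dimensional signals over 1000 nodes without an N x N matrix.
rng = np.random.default_rng(1)
X, V = rng.standard_normal((1000, 32)), rng.standard_normal((1000, 16))
print(random_feature_propagation(X, V).shape)   # (1000, 16)
```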
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - A fast topological approach for predicting anomalies in time-varying
graphs [0.0]
A persistence diagram (PD) from topological data analysis (TDA) has become a popular descriptor of the shape of data, equipped with a well-defined distance between diagrams.
This paper introduces a computationally efficient framework to extract shape information from graph data.
In a real data application, our approach provides up to a 22% gain in anomalous price prediction for cryptocurrency transaction networks.
arXiv Detail & Related papers (2023-05-11T01:54:45Z) - Condensing Graphs via One-Step Gradient Matching [50.07587238142548]
We propose a one-step gradient matching scheme, which performs gradient matching for only one single step without training the network weights.
Our theoretical analysis shows this strategy can generate synthetic graphs that lead to lower classification loss on real graphs.
In particular, we are able to reduce the dataset size by 90% while preserving up to 98% of the original performance.
arXiv Detail & Related papers (2022-06-15T18:20:01Z) - On a linear fused Gromov-Wasserstein distance for graph structured data [2.360534864805446]
We propose a novel distance between two graphs, named linearFGW, defined as the Euclidean distance between their embeddings.
The advantages of the proposed distance are twofold: 1) it takes into account both node features and graph structure when measuring similarity between graphs in a kernel-based framework, and 2) computing the kernel matrix is much faster than with pairwise OT-based distances.
arXiv Detail & Related papers (2022-03-09T13:43:18Z) - Towards Efficient Graph Convolutional Networks for Point Cloud Handling [181.59146413326056]
We aim at improving the computational efficiency of graph convolutional networks (GCNs) for learning on point clouds.
A series of experiments show that optimized networks have reduced computational complexity, decreased memory consumption, and accelerated inference speed.
arXiv Detail & Related papers (2021-04-12T17:59:16Z) - Pyramidal Reservoir Graph Neural Network [18.632681846787246]
We propose a deep Graph Neural Network (GNN) model that alternates two types of layers.
We show how graph pooling can reduce the computational complexity of the model.
Our proposed approach to the design of RC-based GNNs offers an advantageous and principled trade-off between accuracy and complexity.
arXiv Detail & Related papers (2021-04-10T08:34:09Z) - Pseudoinverse Graph Convolutional Networks: Fast Filters Tailored for
Large Eigengaps of Dense Graphs and Hypergraphs [0.0]
Graph Convolutional Networks (GCNs) have proven to be successful tools for semi-supervised classification on graph-based datasets.
We propose a new GCN variant whose three-part filter space is targeted at dense graphs.
arXiv Detail & Related papers (2020-08-03T08:48:41Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - Optimal Transport Graph Neural Networks [31.191844909335963]
Current graph neural network (GNN) architectures naively average or sum node embeddings into an aggregated graph representation.
We introduce OT-GNN, a model that computes graph embeddings using parametric prototypes.
arXiv Detail & Related papers (2020-06-08T14:57:39Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that yields dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)