Semisupervised regression in latent structure networks on unknown manifolds
- URL: http://arxiv.org/abs/2305.02473v1
- Date: Thu, 4 May 2023 00:41:04 GMT
- Title: Semisupervised regression in latent structure networks on unknown manifolds
- Authors: Aranyak Acharyya, Joshua Agterberg, Michael W. Trosset, Youngser Park, Carey E. Priebe
- Abstract summary: We consider random dot product graphs, in which an edge is formed between two nodes with probability given by the inner product of their respective latent positions.
We propose a manifold learning and graph embedding technique to predict the response variable on out-of-sample nodes.
- Score: 7.5722195869569
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Random graphs are increasingly becoming objects of interest for modeling
networks in a wide range of applications. Latent position random graph models
posit that each node is associated with a latent position vector, and that
these vectors follow some geometric structure in the latent space. In this
paper, we consider random dot product graphs, in which an edge is formed
between two nodes with probability given by the inner product of their
respective latent positions. We assume that the latent position vectors lie on
an unknown one-dimensional curve and are coupled with a response covariate via
a regression model. Using the geometry of the underlying latent position
vectors, we propose a manifold learning and graph embedding technique to
predict the response variable on out-of-sample nodes, and we establish
convergence guarantees for these responses. Our theoretical results are
supported by simulations and an application to Drosophila brain data.
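To make the pipeline concrete, the following is a minimal sketch of the kind of procedure the abstract describes: simulate latent positions on a one-dimensional curve, sample a random dot product graph with edge probabilities given by inner products of latent positions, spectrally embed the graph, learn the curve, and regress. The circular-arc curve, the linear response model, and the choice of Isomap plus k-nearest-neighbor regression are illustrative assumptions; the abstract does not specify the authors' exact estimators.

```python
# Sketch of an RDPG -> spectral embedding -> manifold learning -> regression
# pipeline. Assumptions (not from the paper): the latent curve is a circular
# arc, the response is linear in the curve parameter, and Isomap / k-NN
# regression stand in for the manifold-learning and regression steps.
import numpy as np
from sklearn.manifold import Isomap
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
n, d = 600, 2

# 1. Latent positions X_i = gamma(t_i) on an unknown 1-D curve, scaled so
#    that all inner products <X_i, X_j> lie in [0, 1].
t = rng.uniform(0.1, np.pi / 2 - 0.1, size=n)
X = 0.9 * np.column_stack([np.cos(t), np.sin(t)])

# 2. Random dot product graph: edge i~j with probability <X_i, X_j>.
P = X @ X.T
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T  # symmetric, hollow adjacency matrix

# 3. Adjacency spectral embedding: scale the top-d eigenvectors of A by the
#    square roots of the corresponding eigenvalues.
vals, vecs = np.linalg.eigh(A)
top = np.argsort(vals)[::-1][:d]
Xhat = vecs[:, top] * np.sqrt(np.abs(vals[top]))

# 4. Manifold learning: recover a 1-D coordinate along the unknown curve.
that = Isomap(n_components=1, n_neighbors=10).fit_transform(Xhat).ravel()

# 5. Semisupervised regression: responses are observed on a labeled subset
#    only; predict them on the remaining (out-of-sample-like) nodes.
y = 2.0 * t + rng.normal(scale=0.1, size=n)
labeled = rng.choice(n, size=100, replace=False)
reg = KNeighborsRegressor(n_neighbors=15).fit(that[labeled, None], y[labeled])
yhat = reg.predict(that[:, None])

unlabeled = np.setdiff1d(np.arange(n), labeled)
print("MSE on unlabeled nodes:", np.mean((yhat[unlabeled] - y[unlabeled]) ** 2))
```

Note that the spectral embedding recovers the latent positions only up to an orthogonal transformation; this nonidentifiability is harmless here because Isomap depends only on inter-point distances, which orthogonal maps preserve.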
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Reconstructing the Geometry of Random Geometric Graphs [9.004991291124096]
Random geometric graphs are random graph models defined on metric spaces.
We show how to efficiently reconstruct the geometry of the underlying space from the sampled graph.
arXiv Detail & Related papers (2024-02-14T21:34:44Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Conformal Isometry of Lie Group Representation in Recurrent Network of Grid Cells [52.425628028229156]
We study the properties of grid cells using recurrent network models.
We focus on a simple non-linear recurrent model that underlies the continuous attractor neural networks of grid cells.
arXiv Detail & Related papers (2022-10-06T05:26:49Z)
- An Interpretable Graph Generative Model with Heterophily [38.59200985962146]
We propose the first edge-independent graph generative model that is expressive enough to capture heterophily.
Our experiments demonstrate the effectiveness of our model for a variety of important application tasks.
arXiv Detail & Related papers (2021-11-04T17:34:39Z)
- A Probabilistic Graphical Model Approach to the Structure-and-Motion Problem [2.2559617939136505]
We present a means of formulating and solving the well-known structure-and-motion problem in computer vision.
We model the unknown camera poses and 3D feature coordinates as well as the observed 2D projections as Gaussian random variables.
We find that our approach shows promise in both simulation and on real-world data.
arXiv Detail & Related papers (2021-10-07T21:04:38Z)
- Pseudo-Euclidean Attract-Repel Embeddings for Undirected Graphs [73.0261182389643]
Dot product embeddings take a graph and construct vectors for nodes such that dot products between two vectors give the strength of the edge.
We remove the transitivity assumption by embedding nodes into a pseudo-Euclidean space.
Pseudo-Euclidean embeddings can compress networks efficiently, allow for multiple notions of nearest neighbors, each with its own interpretation, and can be 'slotted' into existing models (a toy scoring sketch follows this list).
arXiv Detail & Related papers (2021-06-17T17:23:56Z)
- A Differential Geometry Perspective on Orthogonal Recurrent Models [56.09491978954866]
We employ tools and insights from differential geometry to offer a novel perspective on orthogonal RNNs.
We show that orthogonal RNNs may be viewed as optimizing in the space of divergence-free vector fields.
Motivated by this observation, we study a new recurrent model, which spans the entire space of vector fields.
arXiv Detail & Related papers (2021-02-18T19:39:22Z)
- Random Geometric Graphs on Euclidean Balls [2.28438857884398]
We consider a latent space model for random graphs where a node $i$ is associated to a random latent point $X_i$ on the Euclidean unit ball.
For certain link functions, the model considered here generates graphs whose degree distributions have power-law-type tails.
arXiv Detail & Related papers (2020-10-26T17:21:57Z)
- The multilayer random dot product graph [6.722870980553432]
We present a comprehensive extension of the latent position network model known as the random dot product graph.
We propose a method for jointly embedding submatrices into a suitable latent space.
Empirical improvements in link prediction over single graph embeddings are exhibited in a cyber-security example.
arXiv Detail & Related papers (2020-07-20T20:31:39Z)
- Understanding Graph Neural Networks with Generalized Geometric Scattering Transforms [67.88675386638043]
The scattering transform is a multilayered wavelet-based deep learning architecture that acts as a model of convolutional neural networks.
We introduce windowed and non-windowed geometric scattering transforms for graphs based upon a very general class of asymmetric wavelets.
We show that these asymmetric graph scattering transforms have many of the same theoretical guarantees as their symmetric counterparts.
arXiv Detail & Related papers (2019-11-14T17:23:06Z)
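As a toy illustration of the pseudo-Euclidean scoring idea mentioned in the Attract-Repel entry above: with signature (p, q), the first p coordinates of an embedding contribute positively to an edge score and the remaining q coordinates negatively. The helper below is hypothetical, for illustration only, and is not the paper's API.

```python
# Toy pseudo-Euclidean ("attract-repel") edge score. The (p, q) split and
# the scoring rule are assumptions for illustration only.
import numpy as np

def indefinite_score(u: np.ndarray, v: np.ndarray, p: int) -> float:
    """Indefinite inner product: the first p coordinates attract (add to the
    score) and the remaining coordinates repel (subtract from it)."""
    return float(u[:p] @ v[:p] - u[p:] @ v[p:])

u = np.array([0.8, 0.1, 0.5])  # two attract coordinates, one repel coordinate
v = np.array([0.7, 0.2, 0.6])
print(indefinite_score(u, v, p=2))  # 0.56 + 0.02 - 0.30 = 0.28
```

Because the repel term subtracts, the implied similarity matrix need not be positive semidefinite, so such embeddings can represent heterophilous or bipartite-like structure that a plain dot-product model, whose similarity matrix is always positive semidefinite, cannot.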