Generating Topological Structure of Floorplans from Room Attributes
- URL: http://arxiv.org/abs/2204.12338v1
- Date: Tue, 26 Apr 2022 14:24:58 GMT
- Title: Generating Topological Structure of Floorplans from Room Attributes
- Authors: Yu Yin, Will Hutchcroft, Naji Khosravan, Ivaylo Boyadzhiev, Yun Fu,
Sing Bing Kang
- Abstract summary: We propose to extract topological information from room attributes using Iterative and adaptive graph Topology Learning (ITL).
ITL progressively predicts multiple relations between rooms; at each iteration, it improves node embeddings, which in turn facilitates generation of a better topological graph structure.
- Score: 4.1715767752637145
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Analysis of indoor spaces requires topological information. In this paper, we
propose to extract topological information from room attributes using what we
call Iterative and adaptive graph Topology Learning (ITL). ITL progressively
predicts multiple relations between rooms; at each iteration, it improves node
embeddings, which in turn facilitates generation of a better topological graph
structure. This notion of iterative improvement of node embeddings and
topological graph structure is in the same spirit as \cite{chen2020iterative}.
However, while \cite{chen2020iterative} computes the adjacency matrix based on
node similarity, we learn the graph metric using a relational decoder to
extract room correlations. Experiments using a new challenging indoor dataset
validate our proposed method. Qualitative and quantitative evaluation for
layout topology prediction and floorplan generation applications also
demonstrate the effectiveness of ITL.
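As a rough illustration of this loop (a minimal sketch under assumed shapes and modules, not the authors' implementation), the code below alternates between scoring pairwise room relations with a relational decoder and refining room embeddings with the resulting soft adjacency; the convention that relation 0 means "no edge" is an assumption.

```python
import torch
import torch.nn as nn

class RelationalDecoder(nn.Module):
    """Scores relations between pairs of room embeddings; a stand-in for the
    learned graph metric described in the abstract."""
    def __init__(self, dim, num_relations):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, num_relations)
        )

    def forward(self, h):
        n = h.size(0)
        pairs = torch.cat(
            [h.unsqueeze(1).expand(n, n, -1), h.unsqueeze(0).expand(n, n, -1)],
            dim=-1,
        )
        return self.mlp(pairs)  # (n, n, num_relations) relation logits


class ITLSketch(nn.Module):
    """Alternates between predicting a topology and refining node embeddings."""
    def __init__(self, attr_dim, dim, num_relations, iterations=3):
        super().__init__()
        self.encode = nn.Linear(attr_dim, dim)
        self.decoder = RelationalDecoder(dim, num_relations)
        self.refine = nn.Linear(2 * dim, dim)
        self.iterations = iterations

    def forward(self, room_attrs):
        h = torch.relu(self.encode(room_attrs))
        logits = None
        for _ in range(self.iterations):
            logits = self.decoder(h)  # predict all pairwise relations
            # treat relation 0 as "no edge"; the remaining mass is a soft adjacency
            adj = logits.softmax(dim=-1)[..., 1:].sum(dim=-1)
            agg = adj @ h / adj.sum(dim=-1, keepdim=True).clamp(min=1e-6)
            h = torch.relu(self.refine(torch.cat([h, agg], dim=-1)))
        return logits
```

For example, `ITLSketch(attr_dim=16, dim=64, num_relations=3)(torch.randn(8, 16))` returns relation logits for all pairs of eight rooms.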
Related papers
- Metric-Semantic Factor Graph Generation based on Graph Neural Networks [0.0]
In indoor environments, certain spatial constraints, such as the relative positioning of planes, remain consistent despite variations in layout.
This paper explores how these invariant relationships can be captured in a graph SLAM framework by representing high-level concepts like rooms and walls.
Several efforts have tackled this issue with ad-hoc solutions for generating each concept and with manually defined factors.
This paper proposes a novel method for metric-semantic factor graph generation which includes defining a semantic scene graph, integrating geometric information, and learning the interconnecting factors.
arXiv Detail & Related papers (2024-09-18T13:24:44Z)
- Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
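As a hedged sketch of such an encoder block (the layer composition, normalization, and additive fusion are assumptions, not the GTGAN architecture), a graph-convolution-style update can model local interactions while multi-head self-attention models global ones:

```python
import torch
import torch.nn as nn

class GraphTransformerBlock(nn.Module):
    """Combines a graph-convolution-like local update with self-attention."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.local = nn.Linear(dim, dim)  # GCN-style feature transform
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ffn = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim)
        )

    def forward(self, x, adj):
        # x: (batch, nodes, dim); adj: (batch, nodes, nodes), row-normalized
        local = torch.relu(self.local(adj @ x))   # local interactions
        global_attn, _ = self.attn(x, x, x)       # global interactions
        x = self.norm1(x + local + global_attn)
        return self.norm2(x + self.ffn(x))
```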
arXiv Detail & Related papers (2024-01-15T14:36:38Z)
- Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z)
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, named soft manifolds, that can address this issue.
Using soft manifold for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z)
- On Characterizing the Evolution of Embedding Space of Neural Networks using Algebraic Topology [9.537910170141467]
We study, via Betti numbers, how the topology of the feature embedding space changes as it passes through the layers of a well-trained deep neural network (DNN).
We demonstrate that as depth increases, a topologically complicated dataset is transformed into a simple one, resulting in Betti numbers attaining their lowest possible value.
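As a toy version of this kind of measurement (an assumption-laden sketch, not the paper's pipeline), one can build an epsilon-neighborhood graph on a layer's embeddings and count its connected components, which is the zeroth Betti number at that scale; the higher-order Betti numbers analyzed in the paper require a persistent-homology tool.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def betti_0(embeddings, radius):
    """Zeroth Betti number of a point cloud at a fixed scale: the number of
    connected components of its epsilon-neighborhood graph."""
    dists = np.linalg.norm(embeddings[:, None, :] - embeddings[None, :, :], axis=-1)
    adjacency = csr_matrix((dists <= radius).astype(int))
    n_components, _ = connected_components(adjacency, directed=False)
    return n_components
```

Tracking `betti_0` of the embeddings after each layer, at a fixed radius, gives one coarse view of the topological simplification the paper describes.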
arXiv Detail & Related papers (2023-11-08T10:45:12Z)
- Optimality of Message-Passing Architectures for Sparse Graphs [13.96547777184641]
We study the node classification problem on feature-decorated graphs in the sparse setting, i.e., when the expected degree of a node is $O(1)$ in the number of nodes.
We introduce a notion of Bayes optimality for node classification tasks, called local Bayes optimality.
We show that the optimal message-passing architecture interpolates between a standard MLP in the regime of low graph signal and a typical convolution in the regime of high graph signal.
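Purely as an illustration of what such an interpolation could look like (the paper derives the optimal architecture analytically; this hand-set blend is not it), a single layer can mix an MLP-style update with a mean-aggregation update through a gate `alpha`:

```python
import numpy as np

def interpolated_layer(X, A, W, alpha):
    """alpha = 0 reduces to a plain MLP on node features; alpha = 1 uses only
    neighborhood-averaged features, as in a typical graph convolution."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    neighbor_mean = (A @ X) / deg              # convolution-style aggregation
    mixed = (1 - alpha) * X + alpha * neighbor_mean
    return np.maximum(mixed @ W, 0)            # ReLU
```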
arXiv Detail & Related papers (2023-05-17T17:31:20Z)
- Alleviating neighbor bias: augmenting graph self-supervise learning with structural equivalent positive samples [1.0507062889290775]
We propose a signal-driven self-supervised method for graph representation learning.
It uses a topological information-guided structural equivalence sampling strategy.
The results show that the model performance can be effectively improved.
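As a minimal sketch of the general idea (the crude structural signature below, degree plus mean neighbor degree, merely stands in for the topological guidance used in the paper), positives can be chosen by structural similarity rather than adjacency:

```python
import numpy as np

def structural_positive_pairs(A, top_k=1):
    """For each node, return the top_k nodes with the most similar structural
    signature; these act as structurally equivalent positive samples."""
    deg = A.sum(axis=1)
    neigh_deg = (A @ deg) / np.clip(deg, 1, None)
    signature = np.stack([deg, neigh_deg], axis=1)
    dists = np.linalg.norm(signature[:, None, :] - signature[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)            # a node is not its own positive
    return np.argsort(dists, axis=1)[:, :top_k]
```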
arXiv Detail & Related papers (2022-12-08T16:04:06Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Learning to Learn Graph Topologies [27.782971146122218]
We learn a mapping from node data to the graph structure based on the idea of learning to optimise (L2O).
The model is trained in an end-to-end fashion with pairs of node data and graph samples.
Experiments on both synthetic and real-world data demonstrate that our model is more efficient than classic iterative algorithms in learning a graph with specific topological properties.
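A minimal sketch of the supervised setup described above (a plain encoder that scores edges from node data and is fit on node-data/graph pairs is assumed; the actual model unrolls an optimization algorithm, which is not reproduced here):

```python
import torch
import torch.nn as nn

class GraphLearner(nn.Module):
    """Maps node observations X to symmetric edge logits."""
    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.embed = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )

    def forward(self, x):
        h = self.embed(x)        # (nodes, hidden)
        return h @ h.t()         # (nodes, nodes) edge logits

def train_step(model, optimizer, x, target_adj):
    """One end-to-end step on a (node data, graph) training pair."""
    loss = nn.functional.binary_cross_entropy_with_logits(model(x), target_adj)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```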
arXiv Detail & Related papers (2021-10-19T08:42:38Z)
- Structured Graph Learning for Clustering and Semi-supervised Classification [74.35376212789132]
We propose a graph learning framework to preserve both the local and global structure of data.
Our method uses the self-expressiveness of samples to capture the global structure and an adaptive-neighbor approach to respect the local structure.
Our model is equivalent to a combination of kernel k-means and k-means methods under certain conditions.
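A minimal sketch of the self-expressiveness idea (ridge-regularized here for simplicity, with the zero-diagonal constraint applied only after solving; the paper's full objective, its adaptive-neighbor term, and the kernel k-means equivalence are not reproduced):

```python
import numpy as np

def self_expressive_affinity(X, reg=1e-2):
    """Represent each sample as a combination of the others by solving
    min_C ||X - C X||_F^2 + reg ||C||_F^2, then symmetrize |C| into an
    affinity matrix that captures global structure."""
    n = X.shape[0]
    gram = X @ X.T
    C = np.linalg.solve(gram + reg * np.eye(n), gram)  # ridge solution
    np.fill_diagonal(C, 0.0)                            # drop self-representation
    return 0.5 * (np.abs(C) + np.abs(C).T)
```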
arXiv Detail & Related papers (2020-08-31T08:41:20Z)
- Building powerful and equivariant graph neural networks with structural message-passing [74.93169425144755]
We propose a powerful and equivariant message-passing framework based on two ideas.
First, we propagate a one-hot encoding of the nodes, in addition to the features, in order to learn a local context matrix around each node.
Second, we propose methods for the parametrization of the message and update functions that ensure permutation equivariance.
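A minimal sketch of the first idea (propagating one-hot node identifiers alongside the features so that each node accumulates a local context matrix); the parametrizations of the message and update functions that guarantee permutation equivariance in the paper are not reproduced:

```python
import numpy as np

def local_context(A, X, steps=2):
    """Propagate one-hot node identifiers for a few steps, then pair the
    accumulated identifier mass with the node features.

    context[i] is an (n, d) matrix: row j holds node j's features weighted by
    how much of node j's identifier reached node i."""
    n, d = X.shape
    P = A / A.sum(axis=1, keepdims=True).clip(min=1)  # row-normalized propagation
    identifiers = np.eye(n)
    for _ in range(steps):
        identifiers = P @ identifiers                 # (n, n) identifier mass
    return identifiers[:, :, None] * X[None, :, :]    # (n, n, d) context matrices
```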
arXiv Detail & Related papers (2020-06-26T17:15:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.