Unsupervised Graph Neural Architecture Search with Disentangled
Self-supervision
- URL: http://arxiv.org/abs/2403.05064v1
- Date: Fri, 8 Mar 2024 05:23:55 GMT
- Title: Unsupervised Graph Neural Architecture Search with Disentangled
Self-supervision
- Authors: Zeyang Zhang, Xin Wang, Ziwei Zhang, Guangyao Shen, Shiqi Shen, Wenwu
Zhu
- Abstract summary: Unsupervised graph neural architecture search remains unexplored in the literature.
We propose a novel Disentangled Self-supervised Graph Neural Architecture Search model.
Our model is able to achieve state-of-the-art performance against several baseline methods in an unsupervised manner.
- Score: 51.88848982611515
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The existing graph neural architecture search (GNAS) methods heavily rely on
supervised labels during the search process, failing to handle ubiquitous
scenarios where supervision is not available. In this paper, we study the
problem of unsupervised graph neural architecture search, which remains
unexplored in the literature. The key problem is to discover the latent graph
factors that drive the formation of graph data as well as the underlying
relations between the factors and the optimal neural architectures. Handling
this problem is challenging given that the latent graph factors together with
architectures are highly entangled due to the nature of the graph and the
complexity of the neural architecture search process. To address the challenge,
we propose a novel Disentangled Self-supervised Graph Neural Architecture
Search (DSGAS) model, which is able to discover the optimal architectures
capturing various latent graph factors in a self-supervised fashion based on
unlabeled graph data. Specifically, we first design a disentangled graph
super-network capable of incorporating multiple architectures with factor-wise
disentanglement, which are optimized simultaneously. Then, we estimate the
performance of architectures under different factors by our proposed
self-supervised training with joint architecture-graph disentanglement.
Finally, we propose a contrastive search with architecture augmentations to
discover architectures with factor-specific expertise. Extensive experiments on
11 real-world datasets demonstrate that the proposed model is able to achieve
state-of-the-art performance against several baseline methods in an
unsupervised manner.
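
The abstract's first component is a super-network that holds one candidate sub-architecture per latent graph factor and optimizes them simultaneously. As a rough illustration only, the sketch below shows a factor-wise disentangled super-network in PyTorch, assuming a DARTS-style softmax mixture over candidate operations; the class names, the two-operation candidate set, and the default number of factors are hypothetical and not taken from the paper.

```python
# Hedged sketch of a factor-wise disentangled super-network (assumed
# DARTS-style mixture; DSGAS's actual operator set is defined in the paper).
import torch
import torch.nn as nn
import torch.nn.functional as F

class CandidateOp(nn.Module):
    """One candidate operation: optionally aggregate neighbours, then transform."""
    def __init__(self, dim, aggregate):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.aggregate = aggregate  # if True, mix in neighbour features via adj

    def forward(self, x, adj):
        h = self.lin(x)
        return adj @ h if self.aggregate else h

class FactorSuperNet(nn.Module):
    """K factor-specific sub-architectures, optimized simultaneously."""
    def __init__(self, dim, num_factors=4):
        super().__init__()
        self.num_factors = num_factors
        # Each factor gets its own small candidate-operation set ...
        self.ops = nn.ModuleList([
            nn.ModuleList([CandidateOp(dim, False), CandidateOp(dim, True)])
            for _ in range(num_factors)
        ])
        # ... and its own architecture logits (one per candidate operation).
        self.alpha = nn.Parameter(torch.zeros(num_factors, 2))

    def forward(self, x, adj):
        # x: [N, dim] node features; adj: [N, N] normalized adjacency.
        # Returns one disentangled embedding per latent factor: [K, N, dim].
        outs = []
        for k in range(self.num_factors):
            w = F.softmax(self.alpha[k], dim=-1)
            outs.append(sum(w[i] * op(x, adj) for i, op in enumerate(self.ops[k])))
        return torch.stack(outs)
```

Each factor keeps its own architecture logits, so the K sub-architectures can specialize independently while sharing a single forward pass over the graph.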
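The third component, contrastive search with architecture augmentations, can be illustrated in the same spirit. The sketch below builds on the hypothetical FactorSuperNet above: a "view" is produced by perturbing the architecture logits, and an InfoNCE-style loss pulls the two views of each factor together while treating the other factors as negatives. The perturbation-based augmentation and this exact pairing are assumptions, not the paper's specification.

```python
# Hedged sketch of contrastive search over architecture views (the
# augmentation and loss pairing are assumed, not taken from the paper).
import torch
import torch.nn.functional as F

def architecture_view(model, x, adj, noise=0.1):
    """One augmented 'view': run the super-network with perturbed alphas."""
    saved = model.alpha.data.clone()
    model.alpha.data.add_(noise * torch.randn_like(model.alpha))
    z = model(x, adj)              # [K, N, dim]
    model.alpha.data.copy_(saved)  # restore the original architecture logits
    return z.mean(dim=1)           # graph-level embedding per factor: [K, dim]

def factor_contrastive_loss(model, x, adj, temperature=0.5):
    """Pull the two views of each factor together; other factors are negatives."""
    z1 = F.normalize(architecture_view(model, x, adj), dim=-1)
    z2 = F.normalize(architecture_view(model, x, adj), dim=-1)
    logits = (z1 @ z2.t()) / temperature  # [K, K] cosine similarities
    targets = torch.arange(z1.size(0))    # view k of factor k is the positive
    return F.cross_entropy(logits, targets)
```

Calling `factor_contrastive_loss(net, x, adj).backward()` would then update the operation weights and the architecture logits jointly, without any labels.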
Related papers
- Causal-Aware Graph Neural Architecture Search under Distribution Shifts [48.02254981004058]
Causal-aware Graph Neural Architecture Search (CARNAS) is able to capture the causal graph-architecture relationship during the architecture search process.
We propose Graph Embedding Intervention to intervene on causal subgraphs within the latent space.
arXiv Detail & Related papers (2024-05-26T08:55:22Z)
- Multi-conditioned Graph Diffusion for Neural Architecture Search [8.290336491323796]
We present a graph diffusion-based NAS approach that uses discrete conditional graph diffusion processes to generate high-performing neural network architectures.
We show promising results on six standard benchmarks, rapidly generating novel and unique architectures.
arXiv Detail & Related papers (2024-03-09T21:45:31Z)
- Neural Architecture Retrieval [27.063268631346713]
We define a new problem, Neural Architecture Retrieval, which retrieves a set of existing neural architectures with designs similar to the query neural architecture.
Existing graph pre-training strategies cannot handle the computational graphs of neural architectures due to their size and motifs.
We introduce multi-level contrastive learning to achieve accurate graph representation learning.
arXiv Detail & Related papers (2023-07-16T01:56:41Z)
- Neural combinatorial optimization beyond the TSP: Existing architectures under-represent graph structure [9.673093148930876]
We analyze how and whether recent neural architectures can be applied to graph problems of practical importance.
We show that augmenting the structural representation of problems with distance information is a promising step towards the still-ambitious goal of learning multi-purpose autonomous solvers.
arXiv Detail & Related papers (2022-01-03T14:14:28Z)
- Network Graph Based Neural Architecture Search [57.78724765340237]
We search for neural networks by rewiring the corresponding graphs and predict architecture performance from graph properties.
Because we do not perform machine learning over the entire graph space, the search process is remarkably efficient.
arXiv Detail & Related papers (2021-12-15T00:12:03Z)
- Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show EGNAS can search for better GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
arXiv Detail & Related papers (2021-09-03T07:53:18Z)
- Structural Landmarking and Interaction Modelling: on Resolution Dilemmas in Graph Classification [50.83222170524406]
We study the intrinsic difficulty in graph classification under the unified concept of "resolution dilemmas".
We propose SLIM, an inductive neural network model for Structural Landmarking and Interaction Modelling.
arXiv Detail & Related papers (2020-06-29T01:01:42Z)
- A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
arXiv Detail & Related papers (2020-05-14T09:02:33Z)