Efficient Variational Graph Autoencoders for Unsupervised Cross-domain
Prerequisite Chains
- URL: http://arxiv.org/abs/2109.08722v1
- Date: Fri, 17 Sep 2021 19:07:27 GMT
- Title: Efficient Variational Graph Autoencoders for Unsupervised Cross-domain
Prerequisite Chains
- Authors: Irene Li, Vanessa Yan and Dragomir Radev
- Abstract summary: We introduce Domain-Adversarial Variational Graph Autoencoders (DAVGAE) to solve this cross-domain prerequisite chain learning task efficiently.
Our novel model consists of a variational graph autoencoder (VGAE) and a domain discriminator.
Results show that our model outperforms recent graph-based benchmarks while using only 1/10 of the graph scale and 1/3 of the computation time.
- Score: 3.358838755118655
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Prerequisite chain learning helps people acquire new knowledge efficiently.
While people may quickly determine learning paths over concepts in a domain,
finding such paths in other domains can be challenging. We introduce
Domain-Adversarial Variational Graph Autoencoders (DAVGAE) to solve this
cross-domain prerequisite chain learning task efficiently. Our novel model
consists of a variational graph autoencoder (VGAE) and a domain discriminator.
The VGAE is trained to predict concept relations through link prediction, while
the domain discriminator takes both source and target domain data as input and
is trained to predict domain labels. Most importantly, this method only needs
simple homogeneous graphs as input, compared with the current state-of-the-art
model. We evaluate our model on the LectureBankCD dataset, and results show
that our model outperforms recent graph-based benchmarks while using only 1/10
of the graph scale and 1/3 of the computation time.
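The abstract specifies the architecture at a high level, so a compact sketch is possible. Below is a minimal PyTorch sketch of the DAVGAE idea: a VGAE encoder over a simple homogeneous graph, an inner-product decoder for prerequisite link prediction, and a domain discriminator attached through gradient reversal so the encoder is pushed toward domain-invariant concept embeddings. All names, dimensions, the loss weighting, and the dense stand-ins for GCN layers are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips the gradient sign in the
    backward pass, so the encoder learns to fool the discriminator."""

    @staticmethod
    def forward(ctx, x, lamb):
        ctx.lamb = lamb
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lamb * grad_output, None


class DAVGAESketch(nn.Module):
    def __init__(self, in_dim, hid_dim=64, z_dim=32):
        super().__init__()
        # Dense stand-ins for GCN layers; each is applied as A_norm @ X @ W.
        self.w_hid = nn.Linear(in_dim, hid_dim, bias=False)
        self.w_mu = nn.Linear(hid_dim, z_dim, bias=False)
        self.w_logvar = nn.Linear(hid_dim, z_dim, bias=False)
        # Domain discriminator: source vs. target, from node embeddings.
        self.discriminator = nn.Sequential(
            nn.Linear(z_dim, hid_dim), nn.ReLU(), nn.Linear(hid_dim, 2))

    def forward(self, x, adj_norm, lamb=1.0):
        h = F.relu(adj_norm @ self.w_hid(x))
        mu = adj_norm @ self.w_mu(h)
        logvar = adj_norm @ self.w_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        adj_logits = z @ z.t()  # inner-product decoder -> link prediction
        dom_logits = self.discriminator(GradReverse.apply(z, lamb))
        return adj_logits, dom_logits, mu, logvar


def davgae_loss(adj_logits, dom_logits, adj_target, dom_labels, mu, logvar):
    # Link reconstruction + KL (the VGAE part) + domain classification.
    recon = F.binary_cross_entropy_with_logits(adj_logits, adj_target)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    dom = F.cross_entropy(dom_logits, dom_labels)
    return recon + kl + dom
```

Feeding concept nodes from both domains through one shared encoder and minimizing this joint loss mirrors the abstract's description: the link-prediction and KL terms train the VGAE, while the reversed gradient from the domain loss encourages embeddings the discriminator cannot separate by domain.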
Related papers
- Cross-domain Named Entity Recognition via Graph Matching [25.237288970802425]
Cross-domain NER is a practical yet challenging problem because of data scarcity in real-world scenarios.
We model the label relationship as a probability distribution and construct label graphs in both source and target label spaces.
By representing label relationships as graphs, we formulate cross-domain NER as a graph matching problem.
arXiv Detail & Related papers (2024-08-02T02:31:54Z)
- Multi-label Image Classification using Adaptive Graph Convolutional Networks: from a Single Domain to Multiple Domains [8.02139126500224]
This paper proposes an adaptive graph-based approach for multi-label image classification.
This is achieved by integrating an attention-based mechanism and a similarity-preserving strategy.
The proposed framework is then extended to multiple domains using an adversarial training scheme.
arXiv Detail & Related papers (2023-01-11T14:42:47Z)
- Adapting the Mean Teacher for keypoint-based lung registration under geometric domain shifts [75.51482952586773]
Deep neural networks generally require plenty of labeled training data and are vulnerable to domain shifts between training and test data.
We present a novel approach to geometric domain adaptation for image registration, adapting a model from a labeled source to an unlabeled target domain.
Our method consistently improves on the baseline model by 50%/47% and even matches the accuracy of models trained on target data.
arXiv Detail & Related papers (2022-07-01T12:16:42Z)
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
In addition, we develop an adaptive node-level pre-training method that dynamically masks nodes to distribute them evenly in the graph.
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity in modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Source Free Unsupervised Graph Domain Adaptation [60.901775859601685]
Unsupervised Graph Domain Adaptation (UGDA) shows practical value in reducing the labeling cost of node classification.
Most existing UGDA methods heavily rely on the labeled graph in the source domain.
In some real-world scenarios, the source graph is inaccessible because of privacy issues.
We propose a novel scenario named Source Free Unsupervised Graph Domain Adaptation (SFUGDA).
arXiv Detail & Related papers (2021-12-02T03:18:18Z)
- Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders [2.735701323590668]
We propose unsupervised cross-domain concept prerequisite chain learning using an optimized variational graph autoencoder.
Our model learns to transfer concept prerequisite relations from an information-rich domain to an information-poor domain.
Also, we expand an existing dataset by introducing two new domains: CV and Bioinformatics.
arXiv Detail & Related papers (2021-05-07T21:02:41Z)
- Learning Domain-invariant Graph for Adaptive Semi-supervised Domain Adaptation with Few Labeled Source Samples [65.55521019202557]
Domain adaptation aims to generalize a model from a source domain to tackle tasks in a related but different target domain.
Traditional domain adaptation algorithms assume that enough labeled data, which are treated as prior knowledge, are available in the source domain.
We propose a Domain-invariant Graph Learning (DGL) approach for domain adaptation with only a few labeled source samples.
arXiv Detail & Related papers (2020-08-21T08:13:25Z)
- Unsupervised Intra-domain Adaptation for Semantic Segmentation through Self-Supervision [73.76277367528657]
Convolutional neural network-based approaches have achieved remarkable progress in semantic segmentation, but they rely heavily on annotated data, which are labor-intensive to obtain.
To cope with this limitation, automatically annotated data generated from graphic engines are used to train segmentation models.
We propose a two-step self-supervised domain adaptation approach to minimize the inter-domain and intra-domain gap together.
arXiv Detail & Related papers (2020-04-16T15:24:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.