Exploiting Heterogeneous Graph Neural Networks with Latent Worker/Task
Correlation Information for Label Aggregation in Crowdsourcing
- URL: http://arxiv.org/abs/2010.13080v2
- Date: Thu, 13 May 2021 15:45:26 GMT
- Title: Exploiting Heterogeneous Graph Neural Networks with Latent Worker/Task
Correlation Information for Label Aggregation in Crowdsourcing
- Authors: Hanlu Wu, Tengfei Ma, Lingfei Wu, Shouling Ji
- Abstract summary: Crowdsourcing has attracted much attention for the convenience of collecting labels from non-expert workers instead of experts.
We propose a novel framework based on graph neural networks for aggregating crowd labels.
- Score: 72.34616482076572
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Crowdsourcing has attracted much attention because it makes it
convenient to collect labels from non-expert workers instead of experts.
However, due to the high level of noise from the non-experts, an aggregation
model that infers the true labels by incorporating source credibility is
required. In this paper, we propose a novel framework based on graph neural
networks for aggregating crowd labels. We construct a heterogeneous graph
between workers and tasks and derive a new graph neural network to learn the
representations of the nodes and the true labels. In addition, we exploit the
unknown latent interactions between nodes of the same type (workers or tasks)
by adding a homogeneous attention layer to the graph neural network.
Experimental results on 13 real-world datasets show superior performance over
state-of-the-art models.
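The abstract describes the architecture only at a high level: a bipartite heterogeneous graph between workers and tasks, message passing to learn node representations and true labels, and a homogeneous attention layer among nodes of the same type. The following is a minimal PyTorch sketch of that idea, not the authors' released code; the layer sizes, the message form, the restriction of the homogeneous attention to task nodes, and training against the noisy majority vote are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WorkerTaskGNN(nn.Module):
    """Toy heterogeneous aggregator over a bipartite worker-task graph."""

    def __init__(self, n_workers, n_tasks, n_classes, dim=32):
        super().__init__()
        self.worker_emb = nn.Embedding(n_workers, dim)
        self.task_emb = nn.Embedding(n_tasks, dim)
        self.label_emb = nn.Embedding(n_classes, dim)   # embeds the answer carried by each edge
        self.msg = nn.Linear(2 * dim, dim)
        # Homogeneous step: self-attention among task nodes only (an assumption;
        # the paper applies the idea to both workers and tasks).
        self.task_attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.readout = nn.Linear(dim, n_classes)

    def forward(self, edges):
        # edges: LongTensor of shape [E, 3] with rows (worker_id, task_id, given_label)
        w, t, y = edges[:, 0], edges[:, 1], edges[:, 2]
        # Heterogeneous step: each task aggregates messages from the workers that
        # labelled it, with the worker's answer attached to the message.
        msg = torch.relu(self.msg(torch.cat([self.worker_emb(w), self.label_emb(y)], dim=-1)))
        agg = torch.zeros_like(self.task_emb.weight).index_add_(0, t, msg)
        task_h = self.task_emb.weight + agg
        # Homogeneous step: task nodes attend to each other to capture latent
        # task-task correlation.
        task_h, _ = self.task_attn(task_h.unsqueeze(0), task_h.unsqueeze(0), task_h.unsqueeze(0))
        return self.readout(task_h.squeeze(0))          # [n_tasks, n_classes] logits


# Toy usage: 3 workers label 2 binary tasks; train against the noisy majority
# vote as a stand-in for whatever objective the full model uses.
edges = torch.tensor([[0, 0, 1], [1, 0, 1], [2, 0, 0],
                      [0, 1, 0], [1, 1, 0], [2, 1, 0]])
model = WorkerTaskGNN(n_workers=3, n_tasks=2, n_classes=2)
targets = torch.tensor([1, 0])                          # majority vote per task
loss = F.cross_entropy(model(edges), targets)
loss.backward()
```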
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - Learning on Graphs under Label Noise [5.909452203428086]
We develop a novel approach dubbed Consistent Graph Neural Network (CGNN) to solve the problem of learning on graphs with label noise.
Specifically, we employ graph contrastive learning as a regularization term, which promotes two views of augmented nodes to have consistent representations.
To detect noisy labels on the graph, we present a sample selection technique based on the homophily assumption (see the sketch after this list).
arXiv Detail & Related papers (2023-06-14T01:38:01Z) - SMARTQUERY: An Active Learning Framework for Graph Neural Networks
through Hybrid Uncertainty Reduction [25.77052028238513]
We propose a framework to learn a graph neural network with very few labeled nodes using a hybrid uncertainty reduction function.
We demonstrate the competitive performance of our method against state-of-the-art approaches using very few labeled data.
arXiv Detail & Related papers (2022-12-02T20:49:38Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - Robust Deep Learning from Crowds with Belief Propagation [6.643082745560235]
A graphical model representing local dependencies between workers and tasks provides a principled way of reasoning over the true labels from the noisy answers.
In many cases, one needs a predictive model that works on unseen data and is trained directly from crowdsourced datasets rather than from the true labels.
We propose a new data-generating process, where a neural network generates the true labels from task features.
arXiv Detail & Related papers (2021-11-01T07:20:16Z) - Label Propagation across Graphs: Node Classification using Graph Neural
Tangent Kernels [12.445026956430826]
Graph neural networks (GNNs) have achieved superior performance on node classification tasks.
Our work considers a challenging inductive setting where a set of labeled graphs is available for training while the unlabeled target graph is completely separate.
Under the implicit assumption that the testing and training graphs come from similar distributions, our goal is to develop a labeling function that generalizes to unobserved connectivity structures.
arXiv Detail & Related papers (2021-10-07T19:42:35Z) - Temporal Graph Network Embedding with Causal Anonymous Walks
Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
We also provide a benchmark pipeline for evaluating temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z) - Graph Decoupling Attention Markov Networks for Semi-supervised Graph
Node Classification [38.52231889960877]
Graph neural networks (GNNs) have become ubiquitous in graph learning tasks such as node classification.
In this paper, we consider the label dependency of graph nodes and propose a decoupling attention mechanism to learn both hard and soft attention.
arXiv Detail & Related papers (2021-04-28T11:44:13Z) - Knowledge-Guided Multi-Label Few-Shot Learning for General Image
Recognition [75.44233392355711]
The KGGR framework exploits prior knowledge of statistical label correlations together with deep neural networks.
It first builds a structured knowledge graph that correlates different labels based on statistical label co-occurrence (see the co-occurrence sketch after this list).
Then, it introduces label semantics to guide the learning of semantic-specific features.
It exploits a graph propagation network to explore graph node interactions.
arXiv Detail & Related papers (2020-09-20T15:05:29Z) - Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.