Towards Job-Transition-Tag Graph for a Better Job Title Representation
Learning
- URL: http://arxiv.org/abs/2206.02782v1
- Date: Wed, 4 May 2022 12:11:31 GMT
- Title: Towards Job-Transition-Tag Graph for a Better Job Title Representation
Learning
- Authors: Jun Zhu and Céline Hudelot
- Abstract summary: We reformulate job title representation learning as the task of learning node embeddings on the \textit{Job-Transition-Tag Graph}.
Experiments on two datasets demonstrate the value of our approach.
- Score: 28.571113863266916
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Existing work on job title representation learning is mainly based on
the \textit{Job-Transition Graph}, built from the work histories of talents.
However, since these records are usually messy, this graph is very sparse,
which degrades the quality of the learned representations and hinders further
analysis. To address this issue, we propose to enrich the graph with
additional nodes that improve the quality of job title representations.
Specifically, we construct the \textit{Job-Transition-Tag Graph}, a heterogeneous
graph containing two types of nodes, namely job titles and tags (i.e., words
related to job responsibilities or functions). Along this line, we
reformulate job title representation learning as the task of learning node
embeddings on the \textit{Job-Transition-Tag Graph}. Experiments on two datasets
demonstrate the value of our approach.
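To make the construction concrete, here is a minimal sketch of building such a heterogeneous graph and deriving node embeddings from it. The transition records, tag lists, and the SVD-based embedding are illustrative stand-ins, not the authors' data or model.

```python
# A minimal sketch (not the paper's implementation) of a Job-Transition-Tag
# graph: job-title nodes linked by transitions, enriched with tag nodes.
import numpy as np
import networkx as nx

# Hypothetical data: job transitions from resumes, and tags per job title.
transitions = [("software engineer", "senior software engineer"),
               ("data analyst", "data scientist")]
title_tags = {"software engineer": ["coding", "testing"],
              "senior software engineer": ["coding", "design"],
              "data analyst": ["sql", "reporting"],
              "data scientist": ["python", "modeling"]}

G = nx.Graph()
for src, dst in transitions:                      # job-transition edges
    G.add_node(src, kind="title")
    G.add_node(dst, kind="title")
    G.add_edge(src, dst)
for title, tags in title_tags.items():            # title-tag edges enrich the sparse graph
    for tag in tags:
        G.add_node(tag, kind="tag")
        G.add_edge(title, tag)

# Stand-in embedding: truncated SVD of the adjacency matrix (a simple
# spectral embedding, used here in place of the paper's model).
nodes = list(G.nodes)
A = nx.to_numpy_array(G, nodelist=nodes)
U, S, _ = np.linalg.svd(A)
dim = 4
embeddings = {n: U[i, :dim] * S[:dim] for i, n in enumerate(nodes)}
print(embeddings["data scientist"])
```

In this sketch the tag nodes give otherwise disconnected titles shared neighbors, which is the densification effect the abstract relies on.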
Related papers
- InstructG2I: Synthesizing Images from Multimodal Attributed Graphs [50.852150521561676]
We propose a graph context-conditioned diffusion model called InstructG2I.
InstructG2I first exploits the graph structure and multimodal information to conduct informative neighbor sampling.
A Graph-QFormer encoder adaptively encodes the graph nodes into an auxiliary set of graph prompts to guide the denoising process.
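A rough sketch of what similarity-weighted neighbor sampling can look like, assuming a plain cosine-similarity score over node features; InstructG2I's actual sampling strategy and multimodal features are not reproduced here.

```python
# Toy neighbor sampler: neighbors are drawn with probability proportional
# to a softmax over cosine similarities of (hypothetical) node features.
import numpy as np

def sample_neighbors(node, adjacency, features, k=2, rng=np.random.default_rng(0)):
    """Sample up to k neighbors of `node`, weighted by feature similarity."""
    neighbors = adjacency[node]
    if not neighbors:
        return []
    sims = np.array([features[node] @ features[n] /
                     (np.linalg.norm(features[node]) * np.linalg.norm(features[n]))
                     for n in neighbors])
    probs = np.exp(sims) / np.exp(sims).sum()          # softmax over similarities
    size = min(k, len(neighbors))
    return list(rng.choice(neighbors, size=size, replace=False, p=probs))

adjacency = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
features = {i: np.random.default_rng(i).normal(size=8) for i in range(4)}
print(sample_neighbors(0, adjacency, features))
```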
arXiv Detail & Related papers (2024-10-09T17:56:15Z)
- One for All: Towards Training One Graph Model for All Classification Tasks [61.656962278497225]
A unified model for various graph tasks remains underexplored, primarily due to the challenges unique to the graph learning domain.
We propose \textbf{One for All} (OFA), the first general framework that can use a single graph model to address the above challenges.
OFA performs well across different tasks, making it the first general-purpose, cross-domain classification model on graphs.
arXiv Detail & Related papers (2023-09-29T21:15:26Z)
- Semi-Supervised Hierarchical Graph Classification [54.25165160435073]
We study the node classification problem in the hierarchical graph where a 'node' is a graph instance.
We propose the Hierarchical Graph Mutual Information (HGMI) and present a way to compute HGMI with a theoretical guarantee.
We demonstrate the effectiveness of this hierarchical graph modeling and the proposed SEAL-CI method on text and social network data.
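As a minimal illustration of a hierarchical graph in which each "node" is a graph instance (not the SEAL-CI implementation), one can store an inner graph on every outer node:

```python
# Illustrative data structure: an outer graph whose nodes each hold a
# graph instance; node classification predicts the missing instance label.
import networkx as nx

inner_a = nx.path_graph(3)          # a small graph instance
inner_b = nx.cycle_graph(4)         # another graph instance

hier = nx.Graph()
hier.add_node("a", instance=inner_a, label=None)   # unlabeled instance
hier.add_node("b", instance=inner_b, label=1)      # labeled instance
hier.add_edge("a", "b")                            # relation between instances

for node, data in hier.nodes(data=True):
    print(node, data["instance"].number_of_nodes(), data["label"])
```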
arXiv Detail & Related papers (2022-06-11T04:05:29Z)
- Exploring Graph Representation of Chorales [0.0]
This work explores the overlapping areas of music, graph theory, and machine learning.
An embedding representation of a node in a weighted undirected graph $\mathcal{G}$ is a vector that captures the meaning of the node in an embedding space.
arXiv Detail & Related papers (2022-01-27T09:46:10Z)
- Edge but not Least: Cross-View Graph Pooling [76.71497833616024]
This paper presents a cross-view graph pooling (Co-Pooling) method to better exploit crucial graph structure information.
Through cross-view interaction, edge-view pooling and node-view pooling seamlessly reinforce each other to learn more informative graph-level representations.
arXiv Detail & Related papers (2021-09-24T08:01:23Z)
- Multi-Level Graph Contrastive Learning [38.022118893733804]
We propose a Multi-Level Graph Contrastive Learning (MLGCL) framework for learning robust representation of graph data by contrasting space views of graphs.
The original graph is a first-order approximation structure that contains uncertainty or error, while the $k$NN graph generated from encoded features preserves high-order proximity.
Extensive experiments indicate MLGCL achieves promising results compared with the existing state-of-the-art graph representation learning methods on seven datasets.
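A small sketch of the $k$NN-view construction mentioned above, assuming the node features come from some encoder; the contrastive objective itself is omitted.

```python
# Build a k-nearest-neighbor graph view from (hypothetical) encoded node features.
import numpy as np

def knn_graph(features, k=2):
    """Boolean adjacency matrix connecting each node to its k nearest neighbors."""
    n = features.shape[0]
    dists = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)             # exclude self-loops
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        nearest = np.argsort(dists[i])[:k]
        adj[i, nearest] = True
    return adj | adj.T                          # symmetrize

encoded = np.random.default_rng(0).normal(size=(6, 16))  # assumed encoder output
print(knn_graph(encoded).astype(int))
```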
arXiv Detail & Related papers (2021-07-06T14:24:43Z)
- Edge Representation Learning with Hypergraphs [36.03482700241067]
We propose a novel edge representation learning framework based on Dual Hypergraph Transformation (DHT), which transforms the edges of a graph into the nodes of a hypergraph.
We validate our edge representation learning method with hypergraphs on diverse graph datasets for graph representation and generation performance.
Our edge representation learning and pooling method also largely outperforms state-of-the-art graph pooling methods on graph classification.
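The core transformation can be sketched in a few lines: each edge of the input graph becomes a node of the dual hypergraph, and each original node becomes a hyperedge over its incident edges. This toy version works on edge lists rather than the incidence matrices used in the paper.

```python
# Dual Hypergraph Transformation, edge-list version (illustrative only).
from collections import defaultdict

edges = [("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]

dual_nodes = list(range(len(edges)))                 # one dual node per original edge
hyperedges = defaultdict(list)                       # one hyperedge per original node
for eid, (u, v) in enumerate(edges):
    hyperedges[u].append(eid)
    hyperedges[v].append(eid)

print(dict(hyperedges))
# {'a': [0, 2], 'b': [0, 1], 'c': [1, 2, 3], 'd': [3]}
```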
arXiv Detail & Related papers (2021-06-30T06:59:05Z)
- Job2Vec: Job Title Benchmarking with Collective Multi-View Representation Learning [51.34011135329063]
Job Title Benchmarking (JTB) aims at matching job titles with similar expertise levels across various companies.
Traditional JTB approaches mainly rely on manual market surveys, which are expensive and labor-intensive.
We reformulate JTB as a link prediction task over the Job-Graph, in which matched job titles should be connected by links.
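A toy illustration of that link-prediction view, with random placeholder vectors standing in for Job2Vec's learned title embeddings:

```python
# Score candidate title pairs by embedding dot product; higher score means
# the two titles are more likely to match in expertise level.
import numpy as np

rng = np.random.default_rng(0)
titles = ["software engineer L3", "member of technical staff", "intern"]
emb = {t: rng.normal(size=8) for t in titles}        # placeholder embeddings

def link_score(a, b):
    return float(emb[a] @ emb[b])

pairs = [(a, b) for i, a in enumerate(titles) for b in titles[i + 1:]]
for a, b in sorted(pairs, key=lambda p: -link_score(*p)):
    print(f"{a!r} <-> {b!r}: {link_score(a, b):.3f}")
```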
arXiv Detail & Related papers (2020-09-16T02:33:32Z)
- Second-Order Pooling for Graph Neural Networks [62.13156203025818]
We propose to use second-order pooling as graph pooling, which naturally solves the above challenges.
We show that direct use of second-order pooling with graph neural networks leads to practical problems.
We propose two novel global graph pooling methods based on second-order pooling; namely, bilinear mapping and attentional second-order pooling.
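For reference, plain (bilinear) second-order pooling reduces to flattening $X^\top X$ for a node-feature matrix $X$; a minimal numpy sketch, without the attentional variant:

```python
# Second-order pooling: flatten the d x d second-moment matrix of node features.
import numpy as np

def second_order_pool(X):
    """Pool an (n x d) node-feature matrix to a (d*d,) graph-level vector."""
    return (X.T @ X).reshape(-1)                 # independent of the number of nodes n

X = np.random.default_rng(0).normal(size=(5, 3))   # 5 nodes, 3-dim features
print(second_order_pool(X).shape)                  # (9,)
```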
arXiv Detail & Related papers (2020-07-20T20:52:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.