Neural Language Modeling for Contextualized Temporal Graph Generation
- URL: http://arxiv.org/abs/2010.10077v2
- Date: Mon, 12 Apr 2021 03:37:31 GMT
- Title: Neural Language Modeling for Contextualized Temporal Graph Generation
- Authors: Aman Madaan, Yiming Yang
- Abstract summary: This paper presents the first study on using large-scale pre-trained language models for automated generation of an event-level temporal graph for a document.
- Score: 49.21890450444187
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents the first study on using large-scale pre-trained language models for automated generation of an event-level temporal graph for a document. Despite the huge success of neural pre-training methods in NLP tasks, their potential for temporal reasoning over event graphs has not been sufficiently explored. Part of the reason is the difficulty of obtaining large training corpora with human-annotated events and temporal links. We address this challenge by using existing IE/NLP tools to automatically generate a large quantity (89,000) of system-produced document-graph pairs, and propose a novel formulation of the contextualized graph generation problem as a sequence-to-sequence mapping task. These strategies enable us to leverage and fine-tune pre-trained language models on the system-induced training data for the graph generation task. Our experiments show that our approach is highly effective in generating structurally and semantically valid graphs. Further, evaluation on a challenging hand-labeled, out-of-domain corpus shows that our method outperforms the closest existing method by a large margin on several metrics. Code and pre-trained models are available at https://github.com/madaan/temporal-graph-gen.
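The abstract's key move is casting graph generation as sequence-to-sequence mapping, which makes standard language-model fine-tuning directly applicable. Below is a minimal sketch of that idea with a causal LM: each training pair concatenates a document with a linearized temporal graph, and the LM is fine-tuned to continue the document with its graph. The DOT-style linearization, the " <GRAPH> " separator, and the toy example are illustrative assumptions, not necessarily the paper's exact format.

```python
# Minimal sketch (not the authors' exact setup): fine-tune a causal LM to
# map a document to a linearized temporal graph. The DOT-style graph string
# and the " <GRAPH> " separator are illustrative assumptions.
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# One hypothetical system-produced document-graph training pair.
document = "John boarded the train after he bought a ticket."
graph = 'digraph { "bought a ticket" -> "boarded the train" [label="before"]; }'

# Concatenate source and target so the LM learns p(graph | document).
text = document + " <GRAPH> " + graph + tokenizer.eos_token
batch = tokenizer(text, return_tensors="pt")

# One step of standard LM fine-tuning. (The loss here covers the whole
# sequence; masking document tokens out of the loss is a common refinement.)
outputs = model(**batch, labels=batch["input_ids"])
outputs.loss.backward()
```

At inference time, one would feed `document + " <GRAPH> "` and decode the continuation with `model.generate`, then parse the emitted DOT string back into a graph.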
Related papers
- Dynamic and Textual Graph Generation Via Large-Scale LLM-based Agent Simulation [70.60461609393779]
GraphAgent-Generator (GAG) is a novel simulation-based framework for dynamic graph generation.
Our framework effectively replicates seven macro-level structural characteristics described in established network science theories.
It supports generating graphs with up to nearly 100,000 nodes or 10 million edges, with a minimum speed-up of 90.4%.
arXiv Detail & Related papers (2024-10-13T12:57:08Z)
- Parameter-Efficient Tuning Large Language Models for Graph Representation Learning [62.26278815157628]
We introduce Graph-aware Parameter-Efficient Fine-Tuning (GPEFT), a novel approach for efficient graph representation learning.
We use a graph neural network (GNN) to encode structural information from neighboring nodes into a graph prompt.
We validate our approach through comprehensive experiments on 8 different text-rich graphs, observing an average improvement of 2% in Hit@1 and Mean Reciprocal Rank (MRR) in link prediction evaluations; a minimal sketch of the graph-prompt idea appears after this list.
arXiv Detail & Related papers (2024-04-28T18:36:59Z)
- SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly Simple approach for Textual Graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) on a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of the fine-tuned LM; a sketch of this two-step recipe also appears after the list.
arXiv Detail & Related papers (2023-08-03T07:00:04Z)
- GSHOT: Few-shot Generative Modeling of Labeled Graphs [44.94210194611249]
We introduce the hitherto unexplored paradigm of few-shot graph generative modeling.
We develop GSHOT, a framework for few-shot labeled graph generative modeling.
GSHOT adapts to an unseen graph dataset through self-paced fine-tuning.
arXiv Detail & Related papers (2023-06-06T08:03:18Z)
- Explanation Graph Generation via Generative Pre-training over Synthetic Graphs [6.25568933262682]
Explanation graph generation is the task of producing an explanation graph in response to user input.
Current research commonly fine-tunes a text-based pre-trained language model on a small downstream dataset that is annotated with labeled graphs.
We propose EG3P (Explanation Graph Generation via Generative Pre-training over synthetic graphs), a novel pre-training framework for the explanation graph generation task.
arXiv Detail & Related papers (2023-06-01T13:20:22Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structured data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- TrackMPNN: A Message Passing Graph Neural Architecture for Multi-Object Tracking [8.791710193028903]
This study follows many previous approaches to multi-object tracking (MOT) that model the problem using graph-based data structures.
We create a framework based on dynamic undirected graphs that represent the data association problem over multiple timesteps.
We also provide solutions and propositions for the computational problems that need to be addressed to create a memory-efficient, real-time, online algorithm.
arXiv Detail & Related papers (2021-01-11T21:52:25Z)
- Promoting Graph Awareness in Linearized Graph-to-Text Generation [72.83863719868364]
We study the ability of linearized models to encode local graph structures.
Our findings motivate denoising scaffolds as a solution for enriching the quality of models' implicit graph encodings.
We find that these denoising scaffolds lead to substantial improvements in downstream generation in low-resource settings.
arXiv Detail & Related papers (2020-12-31T18:17:57Z)
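As referenced from the GPEFT entry above, here is a minimal sketch of the graph-prompt idea as that summary describes it: a small GNN-style encoder pools a node's neighborhood into a soft prompt vector that is prepended to the (frozen) LLM's token embeddings. The module names, dimensions, and the mean-pooling stand-in for a real GNN layer are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of the graph-prompt idea as we read the GPEFT summary:
# pool a node's neighborhood into one soft prompt vector and prepend it to
# the LLM's token embeddings. Dimensions and names are assumptions.
import torch
import torch.nn as nn

class GraphPromptEncoder(nn.Module):
    def __init__(self, node_dim: int, llm_dim: int):
        super().__init__()
        self.proj = nn.Linear(node_dim, llm_dim)  # project into LLM embedding space

    def forward(self, center: torch.Tensor, neighbors: torch.Tensor) -> torch.Tensor:
        # Aggregate the center node with its neighbors (a 1-layer GNN in spirit).
        pooled = torch.cat([center.unsqueeze(0), neighbors], dim=0).mean(dim=0)
        return self.proj(pooled)  # shape: (llm_dim,)

encoder = GraphPromptEncoder(node_dim=128, llm_dim=768)
center = torch.randn(128)        # features of the target node
neighbors = torch.randn(5, 128)  # features of its 5 neighbors

graph_prompt = encoder(center, neighbors)
token_embeds = torch.randn(12, 768)  # embeddings of the node's text (12 tokens)

# Prepend the graph prompt; the result feeds a frozen LLM, and only the
# prompt encoder (plus any PEFT adapters) would be trained.
llm_input = torch.cat([graph_prompt.unsqueeze(0), token_embeds], dim=0)
print(llm_input.shape)  # torch.Size([13, 768])
```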
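Similarly, for the SimTeG entry above, a minimal sketch of its two-step recipe: wrap a pre-trained LM with PEFT adapters (LoRA is one common choice), fine-tune on the downstream task, then pool the last hidden states into node embeddings for a downstream GNN. The backbone model, LoRA target modules, and mean pooling are assumptions for illustration; the task-specific training loop is omitted.

```python
# Minimal sketch of the two-step SimTeG recipe summarized above; the
# backbone, LoRA target modules, and mean pooling are assumptions, and the
# downstream-task training loop for step 1 is omitted.
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

# Step 1: add LoRA adapters, then fine-tune on the downstream task (omitted).
peft_model = get_peft_model(model, LoraConfig(target_modules=["q_lin", "v_lin"]))

# Step 2: encode each node's text and mean-pool the last hidden states
# into node embeddings usable by any GNN.
texts = ["Paper about temporal graphs.", "Paper about GNN pre-training."]
batch = tokenizer(texts, padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = peft_model(**batch).last_hidden_state        # (2, seq_len, 768)
mask = batch["attention_mask"].unsqueeze(-1).float()
node_embeddings = (hidden * mask).sum(1) / mask.sum(1)    # (2, 768)
```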
This list is automatically generated from the titles and abstracts of the papers in this site.