Towards Understanding and Analyzing Rationale in Commit Messages using a
Knowledge Graph Approach
- URL: http://arxiv.org/abs/2311.03358v1
- Date: Mon, 4 Sep 2023 13:28:18 GMT
- Title: Towards Understanding and Analyzing Rationale in Commit Messages using a
Knowledge Graph Approach
- Authors: Mouna Dhaouadi, Bentley James Oakes, Michalis Famelis
- Abstract summary: We present our ongoing work on the Kantara end-to-end rationale reconstruction pipeline.
We also present our work on creating a labelled dataset for our running example of the Out-of-Memory component of the Linux kernel.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Extracting rationale information from commit messages allows developers to
better understand a system and its past development. Here we present our
ongoing work on the Kantara end-to-end rationale reconstruction pipeline to a)
structure rationale information in an ontologically-based knowledge graph, b)
extract and classify this information from commits, and c) produce analysis
reports and visualizations for developers. We also present our work on creating
a labelled dataset for our running example of the Out-of-Memory component of
the Linux kernel. This dataset is used as ground truth for our evaluation of
NLP classification techniques, which show promising results, especially
multi-class classification with XGBoost.
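To make the evaluation step concrete, the sketch below trains a gradient-boosted multi-class text classifier on a handful of invented commit-message sentences. The label names and example sentences are hypothetical (not taken from the paper's dataset), and scikit-learn's GradientBoostingClassifier stands in for XGBoost.

```python
# Illustrative sketch: multi-class classification of commit-message
# sentences into rationale-style categories. All data and labels below
# are invented for demonstration; GradientBoostingClassifier is used
# as a stand-in for XGBoost.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.pipeline import make_pipeline

# Hypothetical labelled sentences from OOM-related commit messages.
messages = [
    "Rename oom_badness to oom_score for clarity",
    "Switch to per-node counters to reduce lock contention",
    "Without this patch the OOM killer may select the wrong task",
    "The previous heuristic penalized forking servers unfairly",
    "The reaper thread now runs at normal priority",
    "Testing on a 64-core box showed a small regression",
]
labels = ["Decision", "Decision", "Rationale",
          "Rationale", "Decision", "Supporting Facts"]

# TF-IDF features (unigrams + bigrams) feeding a boosted-tree classifier.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    GradientBoostingClassifier(n_estimators=50, random_state=0),
)
clf.fit(messages, labels)

print(clf.predict(["This change avoids spurious OOM kills"]))
```

Swapping in `xgboost.XGBClassifier` (with labels encoded as integers) would follow the same pipeline shape.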
Related papers
- Automated Extraction and Creation of FBS Design Reasoning Knowledge Graphs from Structured Data in Product Catalogues Lacking Contextual Information
Ontology-based knowledge graphs (KGs) are desirable for effective knowledge management and reuse in various decision-making scenarios.
Most research on the automated extraction and creation of KGs is based on extensive unstructured data sets.
This research reports a method and digital workflow developed to address this gap.
arXiv Detail & Related papers (2024-12-08T09:20:25Z)
- Relational Graph Convolutional Networks for Sentiment Analysis
Relational Graph Convolutional Networks (RGCNs) offer interpretability and flexibility by capturing dependencies between data points represented as nodes in a graph.
We demonstrate the effectiveness of our approach by using pre-trained language models such as BERT and RoBERTa with RGCN architecture on product reviews from Amazon and Digikala datasets.
arXiv Detail & Related papers (2024-04-16T07:27:49Z)
- Contextualization Distillation from Large Language Model for Knowledge Graph Completion
We introduce the Contextualization Distillation strategy, a plug-in-and-play approach compatible with both discriminative and generative KGC frameworks.
Our method begins by instructing large language models to transform compact, structural triplets into context-rich segments.
Comprehensive evaluations across diverse datasets and KGC techniques highlight the efficacy and adaptability of our approach.
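The triplet-to-text step described above can be sketched as a simple verbalization: turn a compact KG triplet into a prompt an LLM could expand into a context-rich passage. The prompt wording and the example triplet here are invented for illustration; the paper's actual instructions may differ.

```python
# Minimal sketch of verbalizing a structural KG triplet into an LLM
# prompt. The phrasing and example triplet are hypothetical.
def triplet_to_prompt(head: str, relation: str, tail: str) -> str:
    fact = f"({head}, {relation}, {tail})"
    return (f"Given the knowledge graph triplet {fact}, write a short, "
            f"factual paragraph that describes this relationship in context.")

print(triplet_to_prompt("Linux kernel", "has_component", "OOM killer"))
```

The LLM's response to such a prompt would then serve as the distilled, context-rich segment for downstream KGC training.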
arXiv Detail & Related papers (2024-01-28T08:56:49Z)
- Knowledge Graphs and Pre-trained Language Models enhanced Representation Learning for Conversational Recommender Systems
We introduce the Knowledge-Enhanced Entity Representation Learning (KERL) framework to improve the semantic understanding of entities for Conversational recommender systems.
KERL uses a knowledge graph and a pre-trained language model to improve the semantic understanding of entities.
KERL achieves state-of-the-art results in both recommendation and response generation tasks.
arXiv Detail & Related papers (2023-12-18T06:41:23Z)
- CADGE: Context-Aware Dialogue Generation Enhanced with Graph-Structured Knowledge Aggregation
A novel context-aware graph-attention model (Context-aware GAT) is proposed.
It assimilates global features from relevant knowledge graphs through a context-enhanced knowledge aggregation mechanism.
Empirical results demonstrate that our framework outperforms conventional GNN-based language models.
arXiv Detail & Related papers (2023-05-10T16:31:35Z)
- Schema-aware Reference as Prompt Improves Data-Efficient Knowledge Graph Construction
We propose a retrieval-augmented approach, which retrieves schema-aware Reference As Prompt (RAP) for data-efficient knowledge graph construction.
RAP can dynamically leverage schema and knowledge inherited from human-annotated and weak-supervised data as a prompt for each sample.
arXiv Detail & Related papers (2022-10-19T16:40:28Z)
- Knowledge Graph Augmented Network Towards Multiview Representation Learning for Aspect-based Sentiment Analysis
We propose a knowledge graph augmented network (KGAN) to incorporate external knowledge with explicitly syntactic and contextual information.
KGAN captures the sentiment feature representations from multiple perspectives, i.e., context-, syntax- and knowledge-based.
Experiments on three popular ABSA benchmarks demonstrate the effectiveness and robustness of our KGAN.
arXiv Detail & Related papers (2022-01-13T08:25:53Z)
- ENT-DESC: Entity Description Generation by Exploring Knowledge Graph
In practice, the input knowledge can exceed what is needed, since the output description may cover only the most significant facts.
We introduce a large-scale and challenging dataset to facilitate the study of such a practical scenario in KG-to-text.
We propose a multi-graph structure that is able to represent the original graph information more comprehensively.
arXiv Detail & Related papers (2020-04-30T14:16:19Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning
We present two self-supervised tasks learning over raw text with the guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
- Incorporating Joint Embeddings into Goal-Oriented Dialogues with Multi-Task Learning
We propose an RNN-based end-to-end encoder-decoder architecture which is trained with joint embeddings of the knowledge graph and the corpus as input.
The model provides an additional integration of user intent along with text generation, trained with a multi-task learning paradigm.
arXiv Detail & Related papers (2020-01-28T17:15:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.