Exploiting Method Names to Improve Code Summarization: A Deliberation Multi-Task Learning Approach
- URL: http://arxiv.org/abs/2103.11448v1
- Date: Sun, 21 Mar 2021 17:52:21 GMT
- Title: Exploiting Method Names to Improve Code Summarization: A Deliberation Multi-Task Learning Approach
- Authors: Rui Xie, Wei Ye, Jinan Sun, Shikun Zhang
- Abstract summary: We design a novel multi-task learning (MTL) approach for code summarization.
We first introduce the tasks of generation and informativeness prediction of method names.
A novel two-pass deliberation mechanism is then incorporated into our MTL architecture to generate more consistent intermediate states.
- Score: 5.577102440028882
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Code summaries are brief natural language descriptions of source code pieces.
The main purpose of code summarization is to assist developers in understanding
code and to reduce documentation workload. In this paper, we design a novel
multi-task learning (MTL) approach for code summarization through mining the
relationship between method code summaries and method names. More specifically,
since a method's name can be considered a shorter version of its code
summary, we first introduce the tasks of generation and informativeness
prediction of method names as two auxiliary training objectives for code
summarization. A novel two-pass deliberation mechanism is then incorporated
into our MTL architecture to generate more consistent intermediate states fed
into a summary decoder, especially when informative method names do not exist.
To evaluate our deliberation MTL approach, we carried out a large-scale
experiment on two existing datasets for Java and Python. The experiment results
show that our technique can be easily applied to many state-of-the-art neural
models for code summarization and improve their performance. Meanwhile, our
approach shows significant superiority when generating summaries for methods
with non-informative names.
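To make the described architecture concrete, below is a minimal sketch of a deliberation multi-task setup, assuming PyTorch. The class name DeliberationMTL, all module sizes, the loss weights, the all-ones informativeness targets, and the wiring of the second pass (concatenating encoder states with first-pass states) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeliberationMTL(nn.Module):
    """Hypothetical sketch: a shared code encoder, two auxiliary heads
    (method-name generation, informativeness prediction), and a two-pass
    deliberation decoder for the summary."""
    def __init__(self, vocab_size=10000, d_model=256, nhead=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=2)
        # Pass 1: decode method-name-like intermediate states (auxiliary task 1).
        self.name_decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
            num_layers=1)
        # Auxiliary task 2: is the method name informative? (binary prediction)
        self.informativeness = nn.Linear(d_model, 1)
        # Pass 2: the summary decoder "deliberates" over the encoder states
        # together with the first-pass states.
        self.summary_decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
            num_layers=2)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, code_ids, name_ids, summary_ids):
        memory = self.encoder(self.embed(code_ids))
        name_states = self.name_decoder(self.embed(name_ids), memory)
        info_logit = self.informativeness(name_states.mean(dim=1))
        joint = torch.cat([memory, name_states], dim=1)
        summary_states = self.summary_decoder(self.embed(summary_ids), joint)
        return self.out(summary_states), self.out(name_states), info_logit

# Joint objective: summary loss plus weighted auxiliary losses. The weights
# 0.5 / 0.1 are placeholders; teacher forcing with shifted targets and
# padding masks is omitted for brevity.
model = DeliberationMTL()
code = torch.randint(0, 10000, (2, 50))
name = torch.randint(0, 10000, (2, 5))
summ = torch.randint(0, 10000, (2, 20))
s_logits, n_logits, info = model(code, name, summ)
ce = nn.CrossEntropyLoss()
loss = (ce(s_logits.transpose(1, 2), summ)
        + 0.5 * ce(n_logits.transpose(1, 2), name)
        + 0.1 * F.binary_cross_entropy_with_logits(info.squeeze(1), torch.ones(2)))
loss.backward()
```

The design choice to feed first-pass (method-name) states into the summary decoder is what lets the model fall back on its own generated name sketch when the actual method name is non-informative.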
Related papers
- ESALE: Enhancing Code-Summary Alignment Learning for Source Code Summarization [21.886950861445122]
Code summarization aims to automatically generate succinct natural language summaries for given code snippets.
This paper proposes a novel approach to improve code summarization based on summary-focused tasks.
arXiv Detail & Related papers (2024-07-01T03:06:51Z)
- SparseCoder: Identifier-Aware Sparse Transformer for File-Level Code Summarization [51.67317895094664]
This paper studies file-level code summarization, which can assist programmers in understanding and maintaining large source code projects.
We propose SparseCoder, an identifier-aware sparse transformer for effectively handling long code sequences.
arXiv Detail & Related papers (2024-01-26T09:23:27Z) - Understanding Code Semantics: An Evaluation of Transformer Models in
Summarization [0.0]
We evaluate the efficacy of code summarization by altering function and variable names.
We introduce adversaries like dead code and commented code across three programming languages.
arXiv Detail & Related papers (2023-10-25T02:41:50Z) - Soft-Labeled Contrastive Pre-training for Function-level Code
Representation [127.71430696347174]
We present SCodeR, a Soft-labeled contrastive pre-training framework with two positive sample construction methods.
By considering the relevance between code snippets in a large-scale code corpus, soft-labeled contrastive pre-training can obtain fine-grained soft labels (see the sketch after this list).
SCodeR achieves new state-of-the-art performance on four code-related tasks over seven datasets.
arXiv Detail & Related papers (2022-10-18T05:17:37Z) - An Extractive-and-Abstractive Framework for Source Code Summarization [28.553366270065656]
Code summarization aims to automatically generate summaries/comments for a given code snippet in the form of natural language.
We propose a novel extractive-and-abstractive framework to generate human-written-like summaries with preserved factual details.
arXiv Detail & Related papers (2022-06-15T02:14:24Z) - Enhancing Semantic Code Search with Multimodal Contrastive Learning and
Soft Data Augmentation [50.14232079160476]
We propose a new approach with multimodal contrastive learning and soft data augmentation for code search.
We conduct extensive experiments to evaluate the effectiveness of our approach on a large-scale dataset with six programming languages.
arXiv Detail & Related papers (2022-04-07T08:49:27Z) - CodeRetriever: Unimodal and Bimodal Contrastive Learning [128.06072658302165]
We propose the CodeRetriever model, which combines unimodal and bimodal contrastive learning to train function-level code semantic representations.
For unimodal contrastive learning, we design a semantic-guided method to build positive code pairs based on the documentation and function name.
For bimodal contrastive learning, we leverage the documentation and in-line comments of code to build text-code pairs.
arXiv Detail & Related papers (2022-01-26T10:54:30Z) - Leveraging Unsupervised Learning to Summarize APIs Discussed in Stack
Overflow [1.8047694351309207]
This paper proposes an automatic and novel approach for summarizing Android API methods discussed in Stack Overflow.
Our approach takes the API method's name as an input and generates a natural language summary based on Stack Overflow discussions of that API method.
We conducted a survey involving 16 Android developers to evaluate the quality of our automatically generated summaries and compare them with the official Android documentation.
arXiv Detail & Related papers (2021-11-27T18:49:51Z) - Knowledge-Aware Procedural Text Understanding with Multi-Stage Training [110.93934567725826]
We focus on the task of procedural text understanding, which aims to comprehend such documents and track entities' states and locations during a process.
Two challenges remain unsolved: the difficulty of commonsense reasoning and data insufficiency.
We propose a novel KnOwledge-Aware proceduraL text understAnding (KOALA) model, which effectively leverages multiple forms of external knowledge.
arXiv Detail & Related papers (2020-09-28T10:28:40Z) - A Transformer-based Approach for Source Code Summarization [86.08359401867577]
We learn code representation for summarization by modeling the pairwise relationship between code tokens.
We show that, despite its simplicity, the approach outperforms state-of-the-art techniques by a significant margin.
arXiv Detail & Related papers (2020-05-01T23:29:36Z) - Leveraging Code Generation to Improve Code Retrieval and Summarization
via Dual Learning [18.354352985591305]
Code summarization generates a brief natural language description for a given source code snippet, while code retrieval fetches relevant source code given a natural language query.
Recent studies have combined these two tasks to improve their performance.
We propose a novel end-to-end model for the two tasks by introducing an additional code generation task.
arXiv Detail & Related papers (2020-02-24T12:26:11Z)
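Several of the entries above (SCodeR, CodeRetriever, the multimodal code-search paper) rely on contrastive learning over code representations. Below is a minimal sketch of the soft-labeled variant referenced in the SCodeR entry, assuming PyTorch; the temperature, how the soft labels are derived, and all tensor shapes are illustrative assumptions rather than details taken from the papers.

```python
import torch
import torch.nn.functional as F

def soft_labeled_contrastive_loss(anchor, positives, soft_labels, tau=0.07):
    """anchor, positives: (batch, dim) embeddings of code snippets.
    soft_labels: (batch, batch) relevance scores (rows sum to 1) that replace
    the one-hot identity targets of standard InfoNCE."""
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positives, dim=-1)
    logits = a @ p.t() / tau  # pairwise cosine similarities, temperature-scaled
    # Cross-entropy against soft targets instead of hard identity targets.
    return -(soft_labels * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()

# Hard-label InfoNCE is recovered when soft_labels is the identity matrix:
emb_a = torch.randn(8, 256, requires_grad=True)
emb_b = torch.randn(8, 256, requires_grad=True)
loss = soft_labeled_contrastive_loss(emb_a, emb_b, torch.eye(8))
loss.backward()
```

Fine-grained soft labels let partially relevant in-batch snippets contribute a graded training signal instead of being treated as pure negatives.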
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.