Boosting Source Code Learning with Data Augmentation: An Empirical Study
- URL: http://arxiv.org/abs/2303.06808v1
- Date: Mon, 13 Mar 2023 01:47:05 GMT
- Title: Boosting Source Code Learning with Data Augmentation: An Empirical Study
- Authors: Zeming Dong, Qiang Hu, Yuejun Guo, Zhenya Zhang, Maxime Cordy, Mike
Papadakis, Yves Le Traon, Jianjun Zhao
- Abstract summary: We study whether data augmentation methods originally used for text and graphs are effective in improving the training quality of source code learning.
Our results identify the data augmentation methods that can produce more accurate and robust models for source code learning.
- Score: 16.49710700412084
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The next era of program understanding is being propelled by the use of
machine learning to solve software problems. Recent studies have shown
impressive results in source code learning, which applies deep neural networks
(DNNs) to various critical software tasks, e.g., bug detection and clone
detection. This success can be largely attributed to the use of massive amounts of
high-quality training data, and in practice, data augmentation, a technique
for producing additional training data, has been widely adopted in
various domains, such as computer vision. In source code learning, however,
data augmentation has not been extensively studied, and existing practice is
limited to simple syntax-preserving methods, such as code refactoring.
Essentially, when used as training data in source code learning, source code is
typically represented in two ways: sequentially, as text data, and structurally,
as graph data. Inspired by these analogies, we take a first
step toward investigating whether data augmentation methods originally used
for text and graphs are effective in improving the training quality of source
code learning. To that end, we first collect and categorize data augmentation
methods in the literature. Second, we conduct a comprehensive empirical study
on four critical tasks and 11 DNN architectures to explore the effectiveness of
12 data augmentation methods (including code refactoring and 11 other methods
for text and graph data). Our results identify the data augmentation methods
that can produce more accurate and robust models for source code learning,
including those based on mixup (e.g., SenMixup for texts and Manifold-Mixup for
graphs), and those that slightly break the syntax of source code (e.g., random
swap and random deletion for texts).
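To make the text-side operators concrete, here is a minimal Python sketch (an illustration, not the paper's implementation) of random swap, random deletion, and a SenMixup-style interpolation of sentence-level embeddings; all function names and parameter values are illustrative assumptions.

```python
import random
import torch

def random_swap(tokens, n_swaps=1):
    """Randomly swap n_swaps pairs of tokens (slightly breaks syntax)."""
    tokens = tokens.copy()
    for _ in range(n_swaps):
        if len(tokens) < 2:
            break
        i, j = random.sample(range(len(tokens)), 2)
        tokens[i], tokens[j] = tokens[j], tokens[i]
    return tokens

def random_deletion(tokens, p=0.1):
    """Drop each token independently with probability p; keep at least one."""
    kept = [t for t in tokens if random.random() > p]
    return kept if kept else [random.choice(tokens)]

def senmixup(emb_a, emb_b, label_a, label_b, alpha=0.2):
    """SenMixup-style interpolation of two sentence-level (code) embeddings
    and their one-hot labels, with lambda drawn from Beta(alpha, alpha)."""
    lam = torch.distributions.Beta(alpha, alpha).sample()
    return lam * emb_a + (1 - lam) * emb_b, lam * label_a + (1 - lam) * label_b

# Usage on a tokenized code snippet
tokens = ["def", "add", "(", "a", ",", "b", ")", ":", "return", "a", "+", "b"]
print(random_swap(tokens, n_swaps=2))
print(random_deletion(tokens, p=0.15))
mixed_emb, mixed_label = senmixup(torch.randn(256), torch.randn(256),
                                  torch.tensor([1., 0.]), torch.tensor([0., 1.]))
```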
Related papers
- Contextualized Data-Wrangling Code Generation in Computational Notebooks [131.26365849822932]
We propose an automated approach, CoCoMine, to mine data-wrangling code generation examples with clear multi-modal contextual dependency.
We construct CoCoNote, a dataset containing 58,221 examples for Contextualized Data-wrangling Code generation in Notebooks.
Experiment results demonstrate the significance of incorporating data context in data-wrangling code generation.
arXiv Detail & Related papers (2024-09-20T14:49:51Z)
- Enhancing Source Code Representations for Deep Learning with Static Analysis [10.222207222039048]
This paper explores the integration of static analysis and additional context such as bug reports and design patterns into source code representations for deep learning models.
We use the Abstract Syntax Tree-based Neural Network (ASTNN) method and augment it with additional context information obtained from bug reports and design patterns.
Our approach enriches the representation and processing of source code, thereby improving task performance.
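A rough sketch of the underlying idea (a hypothetical fusion layer, not the paper's architecture) is to concatenate an AST-based code embedding with embeddings of auxiliary context such as bug reports; the encoder names and dimensions below are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ContextAugmentedEncoder(nn.Module):
    """Hypothetical fusion of an ASTNN-style code embedding with embeddings
    of auxiliary context (e.g., bug reports, design-pattern tags)."""
    def __init__(self, code_dim=256, ctx_dim=128, out_dim=256):
        super().__init__()
        self.fuse = nn.Linear(code_dim + ctx_dim, out_dim)

    def forward(self, code_emb, ctx_emb):
        # Concatenate code and context vectors, then project.
        return torch.relu(self.fuse(torch.cat([code_emb, ctx_emb], dim=-1)))

enc = ContextAugmentedEncoder()
code_emb = torch.randn(4, 256)  # batch of code embeddings (stand-ins)
ctx_emb = torch.randn(4, 128)   # batch of bug-report embeddings (stand-ins)
fused = enc(code_emb, ctx_emb)  # shape: (4, 256)
```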
arXiv Detail & Related papers (2024-02-14T20:17:04Z)
- Source Code Data Augmentation for Deep Learning: A Survey [32.035973285175075]
We conduct a comprehensive survey of data augmentation for source code.
We highlight the general strategies and techniques to optimize the DA quality.
We outline the prevailing challenges and potential opportunities for future research.
arXiv Detail & Related papers (2023-05-31T14:47:44Z)
- Exploring Representation-Level Augmentation for Code Search [50.94201167562845]
We explore methods that augment data (both code and query) at the representation level, which requires no additional data processing or training.
We experimentally evaluate the proposed representation-level augmentation methods with state-of-the-art code search models on a large-scale public dataset.
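A minimal sketch of representation-level augmentation, assuming the operators are applied directly to encoder output vectors; the specific perturbations below (Gaussian noise, linear interpolation) are illustrative, not necessarily those of the cited paper.

```python
import torch

def perturb(rep, sigma=0.01):
    """Add small Gaussian noise to a code or query representation."""
    return rep + sigma * torch.randn_like(rep)

def interpolate(rep_a, rep_b, lam=0.9):
    """Linearly interpolate two representations of the same modality."""
    return lam * rep_a + (1 - lam) * rep_b

# Usage: augment encoder outputs on the fly, with no extra preprocessing
code_vec = torch.randn(256)   # stand-in for an encoded code snippet
query_vec = torch.randn(256)  # stand-in for an encoded query
aug_code = perturb(code_vec)
aug_query = interpolate(query_vec, perturb(query_vec))
```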
arXiv Detail & Related papers (2022-10-21T22:47:37Z)
- Enhancing Semantic Code Search with Multimodal Contrastive Learning and Soft Data Augmentation [50.14232079160476]
We propose a new approach with multimodal contrastive learning and soft data augmentation for code search.
We conduct extensive experiments to evaluate the effectiveness of our approach on a large-scale dataset with six programming languages.
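As a rough illustration of the contrastive objective such approaches typically build on (an assumption, not the paper's exact loss), an InfoNCE-style loss over paired code and query embeddings:

```python
import torch
import torch.nn.functional as F

def info_nce(code_emb, query_emb, temperature=0.07):
    """InfoNCE-style loss: matched code/query pairs are positives,
    all other in-batch pairs serve as negatives."""
    code_emb = F.normalize(code_emb, dim=-1)
    query_emb = F.normalize(query_emb, dim=-1)
    logits = query_emb @ code_emb.t() / temperature  # (batch, batch)
    targets = torch.arange(logits.size(0))           # diagonal = positives
    return F.cross_entropy(logits, targets)

loss = info_nce(torch.randn(8, 256), torch.randn(8, 256))
```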
arXiv Detail & Related papers (2022-04-07T08:49:27Z)
- Lexically Aware Semi-Supervised Learning for OCR Post-Correction [90.54336622024299]
Much of the existing linguistic data in many languages of the world is locked away in non-digitized books and documents.
Previous work has demonstrated the utility of neural post-correction methods on recognition of less-well-resourced languages.
We present a semi-supervised learning method that makes it possible to utilize raw images to improve performance.
arXiv Detail & Related papers (2021-11-04T04:39:02Z)
- Improved Code Summarization via a Graph Neural Network [96.03715569092523]
In general, source code summarization techniques take source code as input and output a natural language description.
We present an approach that uses a graph-based neural architecture that better matches the default structure of the AST to generate these summaries.
arXiv Detail & Related papers (2020-04-06T17:36:42Z)
- Auto-Encoding Twin-Bottleneck Hashing [141.5378966676885]
This paper proposes an efficient and adaptive code-driven graph, which is updated by decoding in the context of an auto-encoder.
Experiments on benchmarked datasets clearly show the superiority of our framework over the state-of-the-art hashing methods.
arXiv Detail & Related papers (2020-02-27T05:58:12Z)