Learning Deep Semantic Model for Code Search using CodeSearchNet Corpus
- URL: http://arxiv.org/abs/2201.11313v1
- Date: Thu, 27 Jan 2022 04:15:59 GMT
- Title: Learning Deep Semantic Model for Code Search using CodeSearchNet Corpus
- Authors: Chen Wu and Ming Yan
- Abstract summary: We propose a novel deep semantic model which makes use of the utilities of multi-modal sources.
We apply the proposed model to tackle the CodeSearchNet challenge about semantic code search.
Our model is trained on the CodeSearchNet corpus and evaluated on held-out data; the final model achieves 0.384 NDCG and won first place in this benchmark.
- Score: 17.6095840480926
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Semantic code search is the task of retrieving a relevant code snippet given a
natural language query. Unlike typical information retrieval tasks, code search must bridge
the semantic gap between programming language and natural language in order to capture the
intrinsic concepts and semantics. Recently, deep neural networks for code search have become
a hot research topic. Typical neural code search methods first represent the code snippet and
the query text as separate embeddings, and then use a vector distance (e.g. dot product or
cosine) to compute the semantic similarity between them. There are many ways to aggregate a
variable-length sequence of code or query tokens into a learnable embedding, including the
bi-encoder, cross-encoder, and poly-encoder. The goal of the query encoder and code encoder is
to produce embeddings that are close to each other for a related pair of query and desired
code snippet, so the choice and design of the encoders is critical.
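As a concrete illustration of the bi-encoder setup described above, here is a minimal sketch (our own illustration, not the authors' code; the toy encoder, vocabulary size, and embedding dimension are placeholder assumptions) that encodes queries and code snippets with separate encoders and ranks snippets by cosine similarity:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BagOfTokensEncoder(nn.Module):
    """Toy encoder: embed tokens and mean-pool them into one fixed-size vector."""
    def __init__(self, vocab_size: int, dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> (batch, dim)
        return self.embed(token_ids).mean(dim=1)

VOCAB = 10_000
query_encoder = BagOfTokensEncoder(VOCAB)  # separate encoders for each modality
code_encoder = BagOfTokensEncoder(VOCAB)

queries = torch.randint(0, VOCAB, (4, 16))   # 4 queries, 16 tokens each
snippets = torch.randint(0, VOCAB, (4, 64))  # 4 code snippets, 64 tokens each

# L2-normalize so the dot product below equals cosine similarity.
q_vec = F.normalize(query_encoder(queries), dim=-1)
c_vec = F.normalize(code_encoder(snippets), dim=-1)

# scores[i, j] is the cosine similarity between query i and snippet j;
# at retrieval time, snippets are ranked by this score for each query.
scores = q_vec @ c_vec.T
print(scores.shape)  # torch.Size([4, 4])
```

A cross-encoder would instead feed the concatenated query and code into a single model, allowing richer interaction between the two at the cost of retrieval speed; the poly-encoder sits between these two extremes.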
In this paper, we propose a novel deep semantic model that exploits not only multi-modal
sources but also feature extractors such as self-attention, aggregated vectors, and
combinations of intermediate representations. We apply the proposed model to the
CodeSearchNet challenge on semantic code search. We align cross-lingual embeddings for
multi-modality learning with large batches and hard example mining, and combine different
learned representations to further strengthen representation learning. Our model is trained
on the CodeSearchNet corpus and evaluated on held-out data; the final model achieves 0.384
NDCG and won first place in this benchmark. Models and code are available at
https://github.com/overwindows/SemanticCodeSearch.git.
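The abstract mentions aligning embeddings with large batches and hard example mining. One common way to realize that idea (a sketch under our own assumptions, not necessarily the paper's exact objective; the temperature, hard_k, and penalty term are illustrative choices) is an in-batch contrastive loss in which the other code snippets in a batch serve as negatives and the hardest negatives receive an extra penalty:

```python
import torch
import torch.nn.functional as F

def in_batch_contrastive_loss(q_vec: torch.Tensor,
                              c_vec: torch.Tensor,
                              temperature: float = 0.05,
                              hard_k: int = 2) -> torch.Tensor:
    """q_vec, c_vec: (batch, dim) L2-normalized query/code embeddings,
    where row i of c_vec is the matching code for query i."""
    sim = q_vec @ c_vec.T / temperature            # (batch, batch) similarity matrix
    labels = torch.arange(sim.size(0))             # diagonal entries are the positives
    loss = F.cross_entropy(sim, labels)            # other codes in the batch are negatives

    # "Hard example mining" (illustrative variant): extra penalty whenever the
    # hard_k most-similar negatives come close to the positive score.
    mask = torch.eye(sim.size(0), dtype=torch.bool)
    neg_sim = sim.masked_fill(mask, float("-inf"))
    hardest = neg_sim.topk(hard_k, dim=1).values   # (batch, hard_k)
    hard_penalty = F.softplus(hardest - sim.diag().unsqueeze(1)).mean()
    return loss + hard_penalty

# Toy usage: larger batches supply more in-batch negatives per query,
# which is the usual motivation for training this kind of objective at scale.
q = F.normalize(torch.randn(8, 128), dim=-1)
c = F.normalize(torch.randn(8, 128), dim=-1)
print(in_batch_contrastive_loss(q, c))
```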
Related papers
- SparseCoder: Identifier-Aware Sparse Transformer for File-Level Code Summarization [51.67317895094664]
This paper studies file-level code summarization, which can assist programmers in understanding and maintaining large source code projects.
We propose SparseCoder, an identifier-aware sparse transformer for effectively handling long code sequences.
arXiv Detail & Related papers (2024-01-26T09:23:27Z) - Survey of Code Search Based on Deep Learning [11.94599964179766]
This survey focuses on code search, that is, retrieving code that matches a given query.
Deep learning, with its ability to extract complex semantic information, has achieved great success in this field.
We propose a new taxonomy to illustrate the state of the art in deep learning-based code search.
arXiv Detail & Related papers (2023-05-10T08:07:04Z) - Generation-Augmented Query Expansion For Code Retrieval [51.20943646688115]
We propose a generation-augmented query expansion framework, inspired by the human retrieval process of sketching an answer before searching.
We achieve new state-of-the-art results on the CodeSearchNet benchmark.
arXiv Detail & Related papers (2022-12-20T23:49:37Z) - Enhancing Semantic Code Search with Multimodal Contrastive Learning and Soft Data Augmentation [50.14232079160476]
We propose a new approach with multimodal contrastive learning and soft data augmentation for code search.
We conduct extensive experiments to evaluate the effectiveness of our approach on a large-scale dataset with six programming languages.
arXiv Detail & Related papers (2022-04-07T08:49:27Z) - CodeRetriever: Unimodal and Bimodal Contrastive Learning [128.06072658302165]
We propose the CodeRetriever model, which combines unimodal and bimodal contrastive learning to train function-level code semantic representations.
For unimodal contrastive learning, we design a semantic-guided method to build positive code pairs based on the documentation and function name.
For bimodal contrastive learning, we leverage the documentation and in-line comments of code to build text-code pairs.
arXiv Detail & Related papers (2022-01-26T10:54:30Z) - Multimodal Representation for Neural Code Search [18.371048875103497]
We introduce tree-serialization methods on a simplified form of AST and build the multimodal representation for the code data.
Our results show that both our tree-serialized representations and multimodal learning model improve the performance of neural code search.
arXiv Detail & Related papers (2021-07-02T12:08:19Z) - BERT2Code: Can Pretrained Language Models be Leveraged for Code Search? [0.7953229555481884]
We show that our model learns the inherent relationship between the embedding spaces, and we further probe the scope for improvement.
In this analysis, we show that the quality of the code embedding model is the bottleneck for our model's performance.
arXiv Detail & Related papers (2021-04-16T10:28:27Z) - Deep Graph Matching and Searching for Semantic Code Retrieval [76.51445515611469]
We propose an end-to-end deep graph matching and searching model based on graph neural networks.
We first represent both natural language query texts and programming language code snippets as unified graph-structured data.
In particular, DGMS not only captures more structural information for individual query texts or code snippets but also learns the fine-grained similarity between them.
arXiv Detail & Related papers (2020-10-24T14:16:50Z) - COSEA: Convolutional Code Search with Layer-wise Attention [90.35777733464354]
We propose a new deep learning architecture, COSEA, which leverages convolutional neural networks with layer-wise attention to capture the code's intrinsic structural logic.
COSEA can achieve significant improvements over state-of-the-art methods on code search tasks.
arXiv Detail & Related papers (2020-10-19T13:53:38Z)