A Survey on Natural Language Processing for Programming
- URL: http://arxiv.org/abs/2212.05773v2
- Date: Sun, 6 Aug 2023 02:10:07 GMT
- Title: A Survey on Natural Language Processing for Programming
- Authors: Qingfu Zhu, Xianzhen Luo, Fang Liu, Cuiyun Gao, Wanxiang Che
- Abstract summary: Natural language processing for programming aims to use NLP techniques to assist programming.
Structure-based representations and functionality-oriented algorithms are at the heart of program understanding and generation.
- Score: 42.850340313115765
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Natural language processing for programming aims to use NLP techniques to
assist programming. It has become increasingly prevalent owing to its effectiveness in
improving productivity. Distinct from natural language, a programming language
is highly structured and functional. Constructing a structure-based
representation and a functionality-oriented algorithm is at the heart of
program understanding and generation. In this paper, we conduct a systematic
review covering tasks, datasets, evaluation methods, techniques, and models
from the perspective of the structure-based and functionality-oriented
properties, aiming to understand the role of the two properties in each
component. Based on the analysis, we illustrate unexplored areas and suggest
potential directions for future work.
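To make the structure-based property concrete, the following minimal sketch (not taken from the paper; it only assumes Python's standard ast module) parses a small function into an abstract syntax tree, the kind of representation that structure-aware program understanding models consume instead of a flat token sequence:

    import ast

    # A small program, treated as structured data rather than a flat token stream.
    source = "def add_abs(a, b):\n    return abs(a) + abs(b)\n"

    # Parse into an abstract syntax tree (AST): a structure-based representation.
    tree = ast.parse(source)

    # Traverse the tree, printing each node type plus any name it carries.
    # A structure-aware encoder would walk or linearise nodes like these
    # instead of reading the raw character sequence.
    for node in ast.walk(tree):
        detail = getattr(node, "name", None) or getattr(node, "id", None) or ""
        print(type(node).__name__, detail)

Unlike a token-level view, the tree makes nesting and operator structure explicit; the functionality-oriented property is typically exercised separately, for example by executing generated programs against test cases.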
Related papers
- Deep Learning and Machine Learning, Advancing Big Data Analytics and Management: Object-Oriented Programming [17.62426370778165]
Object-Oriented Programming (OOP) has become a crucial paradigm for managing the growing complexity of modern software systems.
This work provides a comprehensive introduction to the integration of OOP techniques within these domains.
We examine how design patterns and modular programming can be employed to enhance the structure and efficiency of machine learning systems.
arXiv Detail & Related papers (2024-09-30T03:37:10Z)
- Natural Language Processing for Requirements Traceability [47.93107382627423]
Traceability plays a crucial role in requirements and software engineering, particularly for safety-critical systems.
Natural language processing (NLP) and related techniques have made considerable progress in the past decade.
arXiv Detail & Related papers (2024-05-17T15:17:00Z)
- Engineering A Large Language Model From Scratch [0.0]
Atinuke is a Transformer-based neural network that optimises performance across various language tasks.
It can emulate human-like language by extracting features and learning complex mappings.
The system achieves state-of-the-art results on natural language tasks whilst remaining interpretable and robust.
arXiv Detail & Related papers (2024-01-30T04:29:48Z)
- Exploring Large Language Model for Graph Data Understanding in Online Job Recommendations [63.19448893196642]
We present a novel framework that harnesses the rich contextual information and semantic representations provided by large language models to analyze behavior graphs.
By leveraging this capability, our framework enables personalized and accurate job recommendations for individual users.
arXiv Detail & Related papers (2023-07-10T11:29:41Z)
- Interactive Natural Language Processing [67.87925315773924]
Interactive Natural Language Processing (iNLP) has emerged as a novel paradigm within the field of NLP.
This paper offers a comprehensive survey of iNLP, starting by proposing a unified definition and framework of the concept.
arXiv Detail & Related papers (2023-05-22T17:18:29Z)
- Leveraging Language to Learn Program Abstractions and Search Heuristics [66.28391181268645]
We introduce LAPS (Language for Abstraction and Program Search), a technique for using natural language annotations to guide joint learning of libraries and neurally-guided search models for synthesis.
When integrated into a state-of-the-art library learning system (DreamCoder), LAPS produces higher-quality libraries and improves search efficiency and generalization.
arXiv Detail & Related papers (2021-06-18T15:08:47Z)
- How could Neural Networks understand Programs? [67.4217527949013]
It is difficult to build a model that better understands programs by either directly applying off-the-shelf NLP pre-training techniques to the source code or adding features to the model heuristically.
We propose a novel program semantics learning paradigm, in which the model learns from information composed of (1) representations that align well with the fundamental operations in operational semantics, and (2) the information of environment transitions (see the sketch after this list).
arXiv Detail & Related papers (2021-05-10T12:21:42Z)
- Natural Language Processing Advancements By Deep Learning: A Survey [0.755972004983746]
This survey categorizes and addresses the different aspects and applications of NLP that have benefited from deep learning.
It covers core NLP tasks and applications and describes how deep learning methods and models advance these areas.
arXiv Detail & Related papers (2020-03-02T21:32:05Z)
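As a concrete illustration of the environment-transition idea mentioned under "How could Neural Networks understand Programs?" above, the hypothetical sketch below (not the paper's actual method; just an assumed toy setup) runs a tiny straight-line program one statement at a time and records the variable environment before and after each step, i.e. the operational-semantics view of a program as a sequence of state changes:

    # Hypothetical sketch: log environment transitions for a tiny straight-line program.
    # Each record is (statement, environment before, environment after), the kind of
    # operational-semantics signal the entry above argues models should learn from.
    statements = ["x = 3", "y = x * 2", "z = x + y"]

    env = {}          # the abstract machine's variable environment
    transitions = []  # (statement, env_before, env_after) triples

    for stmt in statements:
        env_before = dict(env)
        exec(stmt, {}, env)  # execute one statement against the current environment
        transitions.append((stmt, env_before, dict(env)))

    for stmt, before, after in transitions:
        print(f"{stmt!r}: {before} -> {after}")

Pairing each statement with the state change it causes exposes execution-level information that the raw source tokens alone do not.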
This list is automatically generated from the titles and abstracts of the papers in this site.