LambdaNet: Probabilistic Type Inference using Graph Neural Networks
- URL: http://arxiv.org/abs/2005.02161v1
- Date: Wed, 29 Apr 2020 17:48:40 GMT
- Title: LambdaNet: Probabilistic Type Inference using Graph Neural Networks
- Authors: Jiayi Wei, Maruth Goyal, Greg Durrett, Isil Dillig
- Abstract summary: This paper proposes a probabilistic type inference scheme for TypeScript based on a graph neural network.
Our approach can predict both standard types, like number or string, as well as user-defined types that have not been encountered during training.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As gradual typing becomes increasingly popular in languages like Python and
TypeScript, there is a growing need to infer type annotations automatically.
While type annotations help with tasks like code completion and static error
catching, these annotations cannot be fully determined by compilers and are
tedious to annotate by hand. This paper proposes a probabilistic type inference
scheme for TypeScript based on a graph neural network. Our approach first uses
lightweight source code analysis to generate a program abstraction called a
type dependency graph, which links type variables with logical constraints as
well as name and usage information. Given this program abstraction, we then use
a graph neural network to propagate information between related type variables
and eventually make type predictions. Our neural architecture can predict both
standard types, like number or string, as well as user-defined types that have
not been encountered during training. Our experimental results show that our
approach outperforms prior work in this space by $14\%$ (absolute) on library
types, while having the ability to make type predictions that are out of scope
for existing techniques.
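To make the pipeline concrete, here is a minimal, illustrative sketch of message passing over a type dependency graph: type variables are nodes, edges link variables related by constraints or name/usage information, and a few propagation rounds mix neighbor embeddings before a nearest-candidate prediction. This is a toy simplification, not LambdaNet's actual architecture (which uses learned, attention-based aggregation and a pointer mechanism for user-defined types); all variable names, embeddings, and the averaging rule below are invented for illustration.

```python
# Toy sketch of GNN-style propagation over a type dependency graph.
# NOT LambdaNet's real model: embeddings, edges, and the mean-aggregation
# update below are hypothetical placeholders for the learned components.

# Candidate type embeddings (illustrative 2-D vectors).
CANDIDATE_TYPES = {
    "number": [1.0, 0.0],
    "string": [0.0, 1.0],
}

# Type dependency graph: edges link type variables that share a logical
# constraint (e.g. an assignment) or a naming/usage relation.
edges = [("x", "y"), ("y", "z")]

# Initial node embeddings (in practice derived from identifiers/usage).
emb = {"x": [0.9, 0.1], "y": [0.5, 0.5], "z": [0.2, 0.8]}

def neighbors(node):
    """All nodes sharing an edge with `node`."""
    return [b if a == node else a for a, b in edges if node in (a, b)]

def propagate(emb, rounds=2):
    """Simplified GNN layer: average each node with its neighbors."""
    for _ in range(rounds):
        new = {}
        for node, vec in emb.items():
            msgs = [emb[n] for n in neighbors(node)] + [vec]
            new[node] = [sum(c) / len(msgs) for c in zip(*msgs)]
        emb = new
    return emb

def predict(vec):
    """Predict the candidate type with the closest embedding."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(CANDIDATE_TYPES, key=lambda t: dist(CANDIDATE_TYPES[t], vec))

final = propagate(emb)
preds = {v: predict(e) for v, e in final.items()}
print(preds)  # e.g. {'x': 'number', 'y': 'number', 'z': 'string'}
```

After two rounds of propagation, each variable's embedding reflects its neighbors, so related variables drift toward consistent predictions; the real model replaces the fixed averaging with trained message functions and handles an open vocabulary of user-defined types.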
Related papers
- Generative Type Inference for Python [62.01560866916557]
This paper introduces TypeGen, a few-shot generative type inference approach that incorporates static domain knowledge from static analysis.
TypeGen creates chain-of-thought (COT) prompts by translating the type inference steps of static analysis into prompts based on the type dependency graphs (TDGs)
Experiments show that TypeGen outperforms the best baseline, Type4Py, by 10.0% for argument type prediction and 22.5% for return value type prediction in terms of top-1 Exact Match.
arXiv Detail & Related papers (2023-07-18T11:40:31Z)
- TypeT5: Seq2seq Type Inference using Static Analysis [51.153089609654174]
We present a new type inference method that treats type prediction as a code infilling task.
Our method uses static analysis to construct dynamic contexts for each code element whose type signature is to be predicted by the model.
We also propose an iterative decoding scheme that incorporates previous type predictions in the model's input context.
arXiv Detail & Related papers (2023-03-16T23:48:00Z)
- Modeling Label Correlations for Ultra-Fine Entity Typing with Neural Pairwise Conditional Random Field [47.22366788848256]
We use an undirected graphical model called pairwise conditional random field (PCRF) to formulate the UFET problem.
We use various modern backbones for entity typing to compute unary potentials and derive pairwise potentials from type phrase representations.
We use mean-field variational inference for efficient type inference on very large type sets and unfold it as a neural network module to enable end-to-end training.
arXiv Detail & Related papers (2022-12-03T09:49:15Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown a powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Type4Py: Deep Similarity Learning-Based Type Inference for Python [9.956021565144662]
We present Type4Py, a deep similarity learning-based type inference model for Python.
We design a hierarchical neural network model that learns to discriminate between types of the same kind and dissimilar types in a high-dimensional space.
Considering the Top-1 prediction, Type4Py obtains 19.33% and 13.49% higher precision than Typilus and TypeWriter, respectively.
arXiv Detail & Related papers (2021-01-12T13:32:53Z)
- Learning to map source code to software vulnerability using code-as-a-graph [67.62847721118142]
We explore the applicability of Graph Neural Networks in learning the nuances of source code from a security perspective.
We show that a code-as-graph encoding is more meaningful for vulnerability detection than existing code-as-photo and linear sequence encoding approaches.
arXiv Detail & Related papers (2020-06-15T16:05:27Z)
- Typilus: Neural Type Hints [17.332608142043004]
We present a graph neural network model that predicts types by probabilistically reasoning over a program's structure, names, and patterns.
Our model can employ one-shot learning to predict an open vocabulary of types, including rare and user-defined ones.
We show that Typilus confidently predicts types for 70% of all annotatable symbols.
arXiv Detail & Related papers (2020-04-06T11:14:03Z)
- OptTyper: Probabilistic Type Inference by Optimising Logical and Natural Constraints [26.80183744947193]
We introduce a framework for probabilistic type inference that combines logic and learning.
We build a tool called OptTyper to predict missing types for TypeScript files.
arXiv Detail & Related papers (2020-04-01T11:32:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.