WasmWalker: Path-based Code Representations for Improved WebAssembly Program Analysis
- URL: http://arxiv.org/abs/2410.08517v1
- Date: Fri, 11 Oct 2024 04:35:34 GMT
- Title: WasmWalker: Path-based Code Representations for Improved WebAssembly Program Analysis
- Authors: Mohammad Robati Shirzad, Patrick Lam
- Abstract summary: WebAssembly, or Wasm, is a low-level binary language that enables execution of near-native-performance code in web browsers.
We propose two novel code representations for WebAssembly binaries.
These novel representations serve not only to generate fixed-size code embeddings but also to supply additional information to sequence-to-sequence models.
- Score: 0.6445605125467572
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: WebAssembly, or Wasm, is a low-level binary language that enables execution of near-native-performance code in web browsers. Wasm has proven to be useful in applications including gaming, audio and video processing, and cloud computing, providing a high-performance, low-overhead alternative to JavaScript in web development. The fast and widespread adoption of WebAssembly by all major browsers has created an opportunity for analysis tools that support this new technology. Deep learning program analysis models can greatly benefit from the program structure information included in Abstract Syntax Tree (AST)-aware code representations. To obtain such code representations, we performed an empirical analysis on the AST paths in the WebAssembly Text format of a large dataset of WebAssembly binary files compiled from source packages in the Ubuntu 18.04 repositories. After refining the collected paths, we discovered that only 3,352 unique paths appeared across all of these binary files. With this insight, we propose two novel code representations for WebAssembly binaries. These novel representations serve not only to generate fixed-size code embeddings but also to supply additional information to sequence-to-sequence models. Ultimately, our approach helps program analysis models uncover new properties from Wasm binaries, expanding our understanding of their potential. We evaluated our new code representation on two applications: (i) method name prediction and (ii) recovering precise return types. Our results demonstrate the superiority of our novel technique over previous methods. More specifically, our new method resulted in 5.36% (11.31%) improvement in Top-1 (Top-5) accuracy in method name prediction and 8.02% (7.92%) improvement in recovering precise return types, compared to the previous state-of-the-art technique, SnowWhite.
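The central mechanism is simple enough to sketch: parse the WebAssembly Text (WAT) S-expressions into a tree, enumerate paths over the tree, and count those paths against a fixed vocabulary to obtain a fixed-size vector. Below is a minimal Python sketch under illustrative assumptions; the tokenizer, the root-to-leaf path definition, and the toy vocabulary are stand-ins, not the paper's exact refinement pipeline.

```python
# Minimal sketch: extracting AST paths from a WebAssembly Text (WAT) snippet
# and turning them into a fixed-size bag-of-paths vector. The tokenizer, the
# path definition (root-to-leaf sequences of node heads), and the toy path
# vocabulary are illustrative assumptions, not the paper's exact pipeline.
from collections import Counter

def tokenize(wat):
    return wat.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    """Parse an S-expression token stream into nested lists."""
    token = tokens.pop(0)
    if token == "(":
        node = []
        while tokens[0] != ")":
            node.append(parse(tokens))
        tokens.pop(0)  # drop ")"
        return node
    return token

def ast_paths(node, prefix=()):
    """Yield root-to-leaf paths of node heads (first symbol of each subtree)."""
    if isinstance(node, list) and node:
        head = node[0] if isinstance(node[0], str) else "expr"
        for child in node[1:]:
            yield from ast_paths(child, prefix + (head,))
        if len(node) == 1:                 # childless form such as (nop)
            yield prefix + (head,)
    else:
        yield prefix + (str(node),)

def embed(paths, vocab):
    """Fixed-size vector: one slot per known path, plus one 'unknown' slot."""
    counts = Counter(paths)
    vec = [counts.get(p, 0) for p in vocab]
    vec.append(sum(c for p, c in counts.items() if p not in vocab))
    return vec

wat = "(module (func $add (param i32 i32) (result i32) (i32.add (local.get 0) (local.get 1))))"
paths = list(ast_paths(parse(tokenize(wat))))
vocab = sorted(set(paths))[:8]   # toy vocabulary; the paper reports 3,352 unique refined paths
print(embed(paths, vocab))
```

Because the refined path set is small and closed (3,352 unique paths across the paper's corpus of binaries built from Ubuntu 18.04 packages), such a bag-of-paths vector stays fixed-size and can also be supplied as side information to sequence-to-sequence models.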
Related papers
- SuperCoder2.0: Technical Report on Exploring the feasibility of LLMs as Autonomous Programmer [0.0]
SuperCoder2.0 is an advanced autonomous system designed to enhance software development through artificial intelligence.
The system combines an AI-native development approach with intelligent agents to enable fully autonomous coding.
arXiv Detail & Related papers (2024-09-17T13:44:42Z)
- Boosting Few-shot 3D Point Cloud Segmentation via Query-Guided Enhancement [30.017448714419455]
This paper proposes a novel approach to improve point cloud few-shot segmentation (PC-FSS) models.
Unlike existing PC-FSS methods that directly utilize categorical information from support prototypes to recognize novel classes in query samples, our method identifies two critical aspects that substantially enhance model performance.
arXiv Detail & Related papers (2023-08-06T18:07:45Z)
- Neural Transition-based Parsing of Library Deprecations [3.6382354548339295]
This paper tackles the problem of automating code updates to fix deprecated API usages of open source libraries by analyzing their release notes.
Our system employs a three-tier architecture: first, a web crawler service retrieves deprecation documentation from the web; then a specially built parser processes those documents into tree-structured representations.
To confirm the effectiveness of our method, we gathered and labeled a set of 426 API deprecations from 7 well-known Python data science libraries, and demonstrated our approach decisively outperforms a non-trivial neural machine translation baseline.
arXiv Detail & Related papers (2022-12-23T20:48:33Z)
- UniASM: Binary Code Similarity Detection without Fine-tuning [0.8271859911016718]
We propose a novel transformer-based binary code embedding model named UniASM to learn representations of the binary functions.
In the real-world task of known vulnerability search, UniASM outperforms all the current baselines.
arXiv Detail & Related papers (2022-10-28T14:04:57Z)
- Exploring Representation-Level Augmentation for Code Search [50.94201167562845]
We explore methods that augment data (both code and query) at the representation level, which requires no additional data processing or training.
We experimentally evaluate the proposed representation-level augmentation methods with state-of-the-art code search models on a large-scale public dataset.
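Representation-level augmentation operates directly on encoder outputs rather than on raw code or queries. A minimal sketch follows; the two operations shown (in-batch interpolation and small Gaussian perturbation) are generic illustrations and not necessarily the paper's exact augmentation set.

```python
# Minimal sketch of representation-level augmentation for code search.
# The operations below (in-batch interpolation and Gaussian perturbation) are
# generic examples of augmenting embeddings directly; the paper's exact
# augmentation set and hyperparameters may differ.
import numpy as np

def interpolate(embs, alpha=0.9, rng=np.random):
    """Mix each embedding with another embedding from the same batch."""
    idx = rng.permutation(len(embs))
    return alpha * embs + (1.0 - alpha) * embs[idx]

def perturb(embs, sigma=0.01, rng=np.random):
    """Add small Gaussian noise to each embedding."""
    return embs + rng.normal(scale=sigma, size=embs.shape)

# Pretend these came from a code/query encoder (e.g. a transformer's pooled output).
code_embs = np.random.randn(4, 8).astype(np.float32)
augmented = perturb(interpolate(code_embs))
print(augmented.shape)  # (4, 8): same shape, no extra data processing or training needed
```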
arXiv Detail & Related papers (2022-10-21T22:47:37Z)
- UnifieR: A Unified Retriever for Large-Scale Retrieval [84.61239936314597]
Large-scale retrieval aims to recall relevant documents from a huge collection given a query.
Recent retrieval methods based on pre-trained language models (PLM) can be coarsely categorized into either dense-vector or lexicon-based paradigms.
We propose a new learning framework, UnifieR, which unifies dense-vector and lexicon-based retrieval in one model with a dual-representing capability.
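A hybrid scorer illustrates the dual-representation idea: one score from a dense vector, one from sparse lexicon weights. The sketch below is an illustrative stand-in; UnifieR learns both views within a single model rather than combining hand-built scores.

```python
# Minimal sketch of scoring with a dual (dense + lexicon) representation.
# The combination below (cosine similarity plus weighted term overlap) is an
# illustrative stand-in; the actual model produces both views jointly.
import numpy as np

def dense_score(q_vec, d_vec):
    return float(np.dot(q_vec, d_vec) / (np.linalg.norm(q_vec) * np.linalg.norm(d_vec)))

def lexicon_score(q_weights, d_weights):
    """Sparse dot product over shared terms (learned term weights in practice)."""
    return sum(w * d_weights[t] for t, w in q_weights.items() if t in d_weights)

def unified_score(q_vec, d_vec, q_weights, d_weights, lam=0.5):
    return lam * dense_score(q_vec, d_vec) + (1 - lam) * lexicon_score(q_weights, d_weights)

q_vec, d_vec = np.random.randn(8), np.random.randn(8)
q_w = {"retrieval": 1.2, "query": 0.8}        # toy query term weights
d_w = {"retrieval": 0.9, "document": 1.1}     # toy document term weights
print(unified_score(q_vec, d_vec, q_w, d_w))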
arXiv Detail & Related papers (2022-05-23T11:01:59Z)
- Autoregressive Search Engines: Generating Substrings as Document Identifiers [53.0729058170278]
Autoregressive language models are emerging as the de-facto standard for generating answers.
Previous work has explored ways to partition the search space into hierarchical structures.
In this work we propose an alternative that imposes no structure on the search space: using all n-grams in a passage as its possible identifiers.
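The identifier scheme is easy to sketch: every n-gram of a passage is a valid identifier, so a generated n-gram can be mapped back to the passages containing it. The toy corpus and the direct dictionary lookup below are simplifications of how generated n-grams are matched against a real corpus.

```python
# Minimal sketch: treating every n-gram of a passage as a possible identifier
# and mapping a generated n-gram back to the passages that contain it. The
# toy corpus and the direct lookup are simplifications; in practice decoding
# is constrained so the model can only generate n-grams that occur in the corpus.
from collections import defaultdict

def ngrams(tokens, max_n=3):
    for n in range(1, max_n + 1):
        for i in range(len(tokens) - n + 1):
            yield tuple(tokens[i:i + n])

corpus = {
    "doc1": "web assembly runs near native code in browsers",
    "doc2": "autoregressive models generate answers token by token",
}

index = defaultdict(set)
for doc_id, text in corpus.items():
    for gram in ngrams(text.split()):
        index[gram].add(doc_id)

generated = ("near", "native", "code")   # an n-gram a language model might emit as an identifier
print(index.get(generated, set()))        # -> {'doc1'}
```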
arXiv Detail & Related papers (2022-04-22T10:45:01Z)
- Semantic-aware Binary Code Representation with BERT [27.908093567605484]
A wide range of binary analysis applications, such as bug discovery, malware analysis, and code clone detection, require recovering the contextual meaning of binary code.
Recently, binary analysis techniques based on machine learning have been proposed to automatically reconstruct the code representation of a binary.
In this paper, we propose DeepSemantic, which utilizes BERT to produce semantic-aware code representations of binary code.
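A typical ingredient for BERT-style models over binaries is normalizing disassembled instructions into a bounded token vocabulary before embedding them. The regex rules and placeholder tokens below are illustrative assumptions, not DeepSemantic's exact normalization scheme; the resulting token sequences are the kind of input a BERT encoder would consume.

```python
# Minimal sketch: normalizing disassembled x86-64 instructions into a bounded
# token vocabulary, the kind of sequence a BERT-style model can embed. The
# regex rules and placeholder tokens ("reg", "imm", "mem") are illustrative.
import re

REGISTERS = {"rax", "rbx", "rcx", "rdx", "rsi", "rdi", "rbp", "rsp",
             "r8", "r9", "r10", "r11", "r12", "r13", "r14", "r15"}

def normalize(instr):
    tokens = re.split(r"[,\s]+", instr.strip().lower())
    out = []
    for tok in tokens:
        if tok in REGISTERS:
            out.append("reg")
        elif re.fullmatch(r"-?0x[0-9a-f]+|-?\d+", tok):
            out.append("imm")
        elif re.fullmatch(r"\[.*\]", tok):
            out.append("mem")
        else:
            out.append(tok)
    return " ".join(out)

snippet = ["mov rax, 0x10", "add rax, rbx", "mov [rsp], rax", "ret"]
print([normalize(i) for i in snippet])
# ['mov reg imm', 'add reg reg', 'mov mem reg', 'ret']
```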
arXiv Detail & Related papers (2021-06-10T03:31:29Z)
- BiO-Net: Learning Recurrent Bi-directional Connections for Encoder-Decoder Architecture [82.64881585566825]
We present a novel Bi-directional O-shape network (BiO-Net) that reuses the building blocks in a recurrent manner without introducing any extra parameters.
Our method significantly outperforms the vanilla U-Net as well as other state-of-the-art methods.
arXiv Detail & Related papers (2020-07-01T05:07:49Z)
- Improved Code Summarization via a Graph Neural Network [96.03715569092523]
In general, source code summarization techniques use the source code as input and output a natural language description.
We present an approach that uses a graph-based neural architecture that better matches the default structure of the AST to generate these summaries.
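Feeding a graph-based model requires turning the AST into nodes and edges first. The sketch below uses Python's ast module as a stand-in for the paper's parser; the learned message passing and token embeddings of the actual model are omitted.

```python
# Minimal sketch: turning a source snippet into an AST node/edge list, the
# kind of graph a GNN-based summarizer consumes. Python's ast module stands
# in for the paper's parser.
import ast

def ast_graph(source):
    """Return (node_labels, edges) where edges connect parent -> child AST nodes."""
    tree = ast.parse(source)
    ids, labels, edges = {}, [], []
    for node in ast.walk(tree):              # ast.walk yields parents before children
        ids[id(node)] = len(labels)
        labels.append(type(node).__name__)
    for node in ast.walk(tree):
        for child in ast.iter_child_nodes(node):
            edges.append((ids[id(node)], ids[id(child)]))
    return labels, edges

labels, edges = ast_graph("def add(a, b):\n    return a + b\n")
print(labels)   # e.g. ['Module', 'FunctionDef', 'arguments', 'Return', ...]
print(edges)    # parent -> child index pairs, ready to feed a GNN
```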
arXiv Detail & Related papers (2020-04-06T17:36:42Z)
- BATS: Binary ArchitecTure Search [56.87581500474093]
We show that directly applying Neural Architecture Search to the binary domain provides very poor results.
To address this, we introduce and design a novel binary-oriented search space.
We also set a new state-of-the-art for binary neural networks on CIFAR10, CIFAR100 and ImageNet datasets.
arXiv Detail & Related papers (2020-03-03T18:57:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.