Iman Hosseini, Brendan Dolan-Gavitt (NYU)

The problem of reversing the compilation process, decompilation, is an important tool in reverse engineering of computer software. Recently, researchers have proposed using techniques from neural machine translation to automate decompilation. Although such techniques hold the promise of targeting a wider range of source and assembly languages, to date they have primarily targeted C code. In this paper we argue that existing neural decompilers have achieved higher accuracy at the cost of requiring language-specific domain knowledge such as tokenizers and parsers to build an abstract syntax tree (AST) for the source language, which increases the overhead of supporting new languages. We explore a different tradeoff that, to the extent possible, treats the assembly and source languages as plain text, and show that this allows us to build a decompiler that is easily retargetable to new languages. We evaluate our prototype decompiler, Beyond The C (BTC), on Go, Fortran, OCaml, and C, and examine the impact of parameters such as tokenization and training data selection on the quality of decompilation, finding that it achieves results comparable to prior work in neural decompilation with significantly less domain knowledge. We will release our training data, trained decompilation models, and code to help encourage future research into language-agnostic decompilation.
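To make the "treat both sides as plain text" idea concrete, the following is a minimal sketch (not the authors' released code) of training a single language-agnostic subword tokenizer over paired assembly and source text, with no lexer or AST for the source language. The file names and SentencePiece settings are illustrative assumptions.

    # Minimal sketch: shared subword tokenization of assembly and source as plain text.
    import sentencepiece as spm

    # Assumed corpus layout: one assembly snippet or source function per line.
    CORPUS_FILES = ["train.asm.txt", "train.src.txt"]

    # Learn a shared subword vocabulary directly from raw text.
    spm.SentencePieceTrainer.train(
        input=",".join(CORPUS_FILES),
        model_prefix="btc_shared",
        vocab_size=8000,          # illustrative size
        character_coverage=1.0,   # keep all symbols seen in assembly/source
    )

    sp = spm.SentencePieceProcessor(model_file="btc_shared.model")

    # The same tokenizer handles any source language (Go, Fortran, OCaml, C, ...)
    # because it never parses: it only splits text into subword pieces.
    asm = "movq %rdi, %rax\naddq $1, %rax\nretq"
    src = "func inc(x int) int { return x + 1 }"
    print(sp.encode(asm, out_type=str))
    print(sp.encode(src, out_type=str))

Retargeting to a new source language under this scheme would mean only collecting new (assembly, source) text pairs and retraining, rather than writing a new parser or AST builder.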
