Heng Yin, Professor, Department of Computer Science and Engineering, University of California, Riverside

Deep learning, particularly Transformer-based models, has recently gained traction in binary analysis and shown promising results. Although numerous studies customize these models for specific applications, the impact of such modifications on performance remains largely unexamined. Our study critically evaluates four customized Transformer models (jTrans, PalmTree, StateFormer, and Trex) across a range of applications and finds that, aside from the Masked Language Model (MLM) task, the additional pre-training tasks do not significantly enhance learning. Surprisingly, the original BERT model often outperforms these adaptations, indicating that complex modifications and new pre-training tasks may be superfluous. Our findings suggest that effort is better spent on fine-tuning than on architectural or task-related alterations when applying these models to binary analysis.
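To make the recommended recipe concrete, the following is a minimal, hypothetical sketch of the two-stage approach the abstract points to: MLM pre-training of a vanilla BERT over normalized assembly token sequences, followed by task-specific fine-tuning of the same encoder. The tokenizer, toy corpus, model sizes, and the classification head are illustrative assumptions, not the setup or datasets used in the study.

```python
# Sketch only: a vanilla-BERT MLM pre-training step on assembly-like tokens,
# then reuse of the pre-trained encoder for downstream fine-tuning.
# Corpus, vocabulary, and hyperparameters are toy placeholders.
import torch
from tokenizers import Tokenizer, models, pre_tokenizers, trainers
from transformers import (BertConfig, BertForMaskedLM,
                          BertForSequenceClassification, PreTrainedTokenizerFast)

# Toy "corpus" of normalized instructions (real corpora come from disassembled binaries).
corpus = ["mov rax , rbx", "push rbp", "mov rbp , rsp", "call FUNC", "ret"]

# Build a whitespace-level vocabulary over assembly tokens.
tok = Tokenizer(models.WordLevel(unk_token="[UNK]"))
tok.pre_tokenizer = pre_tokenizers.Whitespace()
tok.train_from_iterator(corpus, trainers.WordLevelTrainer(
    special_tokens=["[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]"]))
tokenizer = PreTrainedTokenizerFast(tokenizer_object=tok, pad_token="[PAD]",
                                    unk_token="[UNK]", cls_token="[CLS]",
                                    sep_token="[SEP]", mask_token="[MASK]")

config = BertConfig(vocab_size=tokenizer.vocab_size, hidden_size=128,
                    num_hidden_layers=2, num_attention_heads=2)

# Stage 1: MLM pre-training (one toy step; in practice, many epochs over many binaries).
mlm_model = BertForMaskedLM(config)
batch = tokenizer(corpus, padding=True, return_tensors="pt")
labels = batch["input_ids"].clone()
mask = torch.rand(labels.shape) < 0.15              # mask ~15% of tokens
batch["input_ids"][mask] = tokenizer.mask_token_id
# Simplification: loss is computed over all positions here; standard MLM
# sets unmasked label positions to -100 so only masked tokens contribute.
loss = mlm_model(**batch, labels=labels).loss
loss.backward()

# Stage 2: fine-tuning. A new task head (here, sequence classification as a stand-in
# for a downstream binary-analysis task) is attached to the pre-trained encoder.
clf = BertForSequenceClassification(config)
clf.bert.load_state_dict(mlm_model.bert.state_dict(), strict=False)  # reuse encoder weights
```

The point of the sketch is the division of labor: the encoder and its MLM objective stay plain, and the task-specific effort goes into fine-tuning the downstream head rather than into new architectures or extra pre-training objectives.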

Speaker's Biography: Dr. Heng Yin is a Professor in the Department of Computer Science and Engineering at the University of California, Riverside. He obtained his PhD degree from the College of William and Mary in 2009. His research interests lie in computer security, with an emphasis on binary code analysis. His publications appear in top technical conferences and journals, such as IEEE S&P, ACM CCS, USENIX Security, NDSS, ISSTA, ICSE, TSE, and TDSC. His research is sponsored by the National Science Foundation (NSF), the Defense Advanced Research Projects Agency (DARPA), the Air Force Office of Scientific Research (AFOSR), and the Office of Naval Research (ONR). In 2011, he received the prestigious NSF CAREER Award. He has also received the Google Security and Privacy Research Award, an Amazon Research Award, a DSN Distinguished Paper Award, and a RAID Best Paper Award.
