Mohammad Naseri (University College London), Jamie Hayes (DeepMind), Emiliano De Cristofaro (University College London & Alan Turing Institute)

Federated Learning (FL) allows multiple participants to train machine learning models collaboratively by keeping their datasets local and only exchanging model updates. Alas, this is not necessarily free from privacy and robustness vulnerabilities, e.g., via membership, property, and backdoor attacks. This paper investigates whether, and to what extent, Differential Privacy (DP) can be used to protect both privacy and robustness in FL. To this end, we present a first-of-its-kind evaluation of Local and Central Differential Privacy (LDP/CDP) techniques in FL, assessing their feasibility and effectiveness.

Our experiments show that both DP variants do defend against backdoor attacks, albeit with varying protection-utility trade-offs, and in any case more effectively than other robustness defenses. DP also mitigates white-box membership inference attacks in FL, and ours is the first work to show this empirically. Neither LDP nor CDP, however, defends against property inference. Overall, our work provides a comprehensive, reusable measurement methodology to quantify the trade-offs between robustness/privacy and utility in differentially private FL.
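To make the LDP/CDP distinction concrete, below is a minimal NumPy sketch of one FedAvg-style aggregation round: under CDP a trusted server clips each client's update and adds Gaussian noise once to the aggregate, while under LDP each client perturbs its own update before sending it. The clipping bound, noise multiplier, and helper names are illustrative assumptions, not the paper's actual configuration or implementation.

```python
# Illustrative sketch (not the paper's code): CDP vs. LDP noising in a
# FedAvg-style round, with plain NumPy vectors standing in for model updates.
import numpy as np

rng = np.random.default_rng(0)

CLIP_NORM = 1.0         # per-update L2 clipping bound (assumed value)
NOISE_MULTIPLIER = 1.1  # Gaussian noise scale relative to the clip norm (assumed value)


def clip_update(update, clip_norm=CLIP_NORM):
    """Scale an update so its L2 norm is at most clip_norm."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))


def ldp_client_update(local_update):
    """LDP: each client clips and noises its own update before sharing it."""
    clipped = clip_update(local_update)
    noise = rng.normal(0.0, NOISE_MULTIPLIER * CLIP_NORM, size=clipped.shape)
    return clipped + noise


def cdp_aggregate(client_updates):
    """CDP: a trusted server clips each update, averages, then adds noise once."""
    clipped = [clip_update(u) for u in client_updates]
    avg = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, NOISE_MULTIPLIER * CLIP_NORM / len(client_updates),
                       size=avg.shape)
    return avg + noise


if __name__ == "__main__":
    # Toy "model updates" from three clients (e.g., gradients or weight deltas).
    updates = [rng.normal(size=10) for _ in range(3)]

    # CDP round: raw updates reach the server, which applies DP at aggregation time.
    global_update_cdp = cdp_aggregate(updates)

    # LDP round: each client noises locally; the server simply averages.
    global_update_ldp = np.mean([ldp_client_update(u) for u in updates], axis=0)

    print("CDP aggregate:", np.round(global_update_cdp, 3))
    print("LDP aggregate:", np.round(global_update_ldp, 3))
```

The design difference the sketch highlights is where trust is placed: CDP adds a single noise draw at a trusted aggregator, whereas LDP requires no trusted server but injects noise per client, which typically costs more utility for the same privacy budget.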
