Tianhang Zheng (University of Missouri-Kansas City), Baochun Li (University of Toronto)

Recent work in ICML’22 established a connection between dataset condensation (DC) and differential privacy (DP); unfortunately, that connection is problematic. To connect DC and DP correctly, we propose two differentially private dataset condensation (DPDC) algorithms—LDPDC and NDPDC. LDPDC is a linear DC algorithm that can be executed on a low-end Central Processing Unit (CPU), while NDPDC is a nonlinear DC algorithm that leverages neural networks to extract and match the latent representations of real and synthetic data. Through extensive evaluations, we demonstrate that, despite its simplicity, LDPDC performs comparably to recent DP generative methods. Compared to distribution matching (DM), NDPDC provides acceptable DP guarantees with only a mild loss in utility. Additionally, NDPDC allows a flexible trade-off between synthetic data utility and the DP budget.
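
To make the DM-style matching idea concrete, the sketch below shows one condensation step in which per-sample embeddings of real data are clipped and their mean is perturbed with Gaussian noise before the synthetic data are updated to match it. This is a minimal illustration under assumed choices (the embedding network `embed_net` and the hyperparameters `clip_norm` and `noise_mult` are hypothetical), not the authors' exact NDPDC procedure.

```python
import torch
import torch.nn as nn

def dp_distribution_matching_step(embed_net: nn.Module,
                                  real_batch: torch.Tensor,
                                  syn_data: torch.Tensor,
                                  opt: torch.optim.Optimizer,
                                  clip_norm: float = 1.0,
                                  noise_mult: float = 1.0) -> float:
    """One illustrative DM-style step with a DP-noised real-data mean embedding."""
    with torch.no_grad():
        real_feat = embed_net(real_batch)                   # per-sample embeddings of real data
        norms = real_feat.norm(dim=1, keepdim=True).clamp(min=1e-12)
        real_feat = real_feat * (clip_norm / norms).clamp(max=1.0)   # clip each embedding to clip_norm
        real_mean = real_feat.mean(dim=0)
        # Gaussian noise calibrated to the clipped per-sample sensitivity yields a
        # differentially private estimate of the real-data mean embedding.
        real_mean = real_mean + torch.randn_like(real_mean) * (
            noise_mult * clip_norm / real_batch.size(0))

    syn_mean = embed_net(syn_data).mean(dim=0)              # synthetic embeddings need no noise
    loss = ((syn_mean - real_mean) ** 2).sum()              # match synthetic mean to the noised real mean
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

In this sketch only the real data are touched by clipping and noise, so the synthetic data can be optimized freely against the privatized target; the noise multiplier would be set from the desired DP budget via a standard accountant.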
