Tianhang Zheng (University of Missouri-Kansas City), Baochun Li (University of Toronto)

Recent work at ICML'22 established a connection between dataset condensation (DC) and differential privacy (DP) that is unfortunately problematic. To correctly connect DC and DP, we propose two differentially private dataset condensation (DPDC) algorithms: LDPDC and NDPDC. LDPDC is a linear DC algorithm that can be executed on a low-end Central Processing Unit (CPU), while NDPDC is a nonlinear DC algorithm that leverages neural networks to extract and match the latent representations of real and synthetic data. Through extensive evaluations, we demonstrate that LDPDC achieves performance comparable to recent DP generative methods despite its simplicity. NDPDC provides acceptable DP guarantees with a mild utility loss compared to distribution matching (DM). Additionally, NDPDC allows a flexible trade-off between synthetic data utility and the DP budget.
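The abstract does not specify how NDPDC privatizes the latent-representation matching. The sketch below illustrates one plausible realization under an assumed Gaussian-mechanism design in the style of DP-SGD: per-sample embeddings of the real batch are clipped, their mean is perturbed with calibrated noise, and the synthetic data are optimized to match the privatized target. All names here (extractor, clip_norm, sigma, dp_matching_step) are illustrative assumptions, not taken from the paper.

import torch

def dp_matching_step(extractor, real_batch, syn_data, optimizer,
                     clip_norm=1.0, sigma=1.0):
    # Embed the real batch; no gradients are needed for the target.
    with torch.no_grad():
        real_feat = extractor(real_batch)                  # shape (B, d)
        # Clip each embedding to bound per-sample sensitivity.
        norms = real_feat.norm(dim=1, keepdim=True).clamp(min=1e-12)
        real_feat = real_feat * (clip_norm / norms).clamp(max=1.0)
        # Gaussian mechanism applied to the clipped-embedding mean.
        mean_feat = real_feat.mean(dim=0)
        noise = torch.randn_like(mean_feat) * sigma * clip_norm / real_batch.size(0)
        target = mean_feat + noise

    # Pull the synthetic embeddings toward the privatized target.
    syn_feat = extractor(syn_data).mean(dim=0)
    loss = torch.nn.functional.mse_loss(syn_feat, target)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage sketch: the synthetic images themselves are the optimized variables.
syn_data = torch.randn(100, 3, 32, 32, requires_grad=True)
optimizer = torch.optim.SGD([syn_data], lr=0.1)

Repeating such a step over many real batches would consume privacy budget under a composition analysis (e.g., Rényi DP accounting); the actual noise calibration and accounting used in NDPDC may differ.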
