Privacy-Preserving Student Learning with Differentially Private Data-Free Distillation

Bochao Liu, Jianghu Lu, Pengju Wang, Junjie Zhang, Dan Zeng, Zhenxing Qian, Shiming Ge*

*Corresponding author for this work

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

2 Citations (Scopus)

Abstract

Deep learning models can achieve high inference accuracy by extracting rich knowledge from massive well-annotated data, but may pose the risk of data privacy leakage in practical deployment. In this paper, we present an effective teacher-student learning approach to train privacy-preserving deep learning models via differentially private data-free distillation. The main idea is to generate synthetic data to learn a student that can mimic the ability of a teacher well-trained on private data. In the approach, a generator is first pretrained in a data-free manner by incorporating the teacher as a fixed discriminator. With the generator, massive synthetic data can be generated for model training without exposing data privacy. Then, the synthetic data is fed into the teacher to generate labels. To protect the label information, we propose a label differential privacy algorithm termed selective randomized response. Finally, a student is trained on the synthetic data with the supervision of private labels. In this way, both data privacy and label privacy are well protected in a unified framework, leading to privacy-preserving models. Extensive experiments and analysis clearly demonstrate the effectiveness of our approach.
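The paper's "selective randomized response" mechanism is not detailed in this abstract, but it builds on classical K-ary randomized response for label differential privacy. As an illustration only, the sketch below shows the standard (non-selective) randomized response over K classes: the true label is kept with probability e^ε / (e^ε + K − 1), otherwise one of the other labels is returned uniformly at random. The function name and parameters are hypothetical, not from the paper.

```python
import math
import random

def randomized_response(label: int, num_classes: int, epsilon: float,
                        rng: random.Random) -> int:
    """Return a label satisfying epsilon-label-DP via K-ary randomized response.

    Illustrative sketch: the true label is kept with probability
    e^eps / (e^eps + K - 1); otherwise one of the K - 1 other labels
    is returned uniformly at random.
    """
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + num_classes - 1)
    if rng.random() < p_keep:
        return label
    # Pick uniformly among the other K - 1 labels, skipping the true one.
    other = rng.randrange(num_classes - 1)
    return other if other < label else other + 1

# Example: privatize labels produced by a teacher on synthetic data.
rng = random.Random(0)
teacher_labels = [3, 7, 1, 3]
private_labels = [randomized_response(y, num_classes=10, epsilon=2.0, rng=rng)
                  for y in teacher_labels]
```

With large ε the mechanism almost always keeps the true label; with small ε the output approaches a uniform distribution, giving stronger privacy at the cost of noisier supervision for the student.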

Original language: English
Title of host publication: 2022 IEEE 24th International Workshop on Multimedia Signal Processing, MMSP 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665471893
DOIs
Publication status: Published - 2022
Externally published: Yes
Event: 24th IEEE International Workshop on Multimedia Signal Processing, MMSP 2022 - Shanghai, China
Duration: 26 Sept 2022 → 28 Sept 2022

Publication series

Name: 2022 IEEE 24th International Workshop on Multimedia Signal Processing, MMSP 2022

Conference

Conference: 24th IEEE International Workshop on Multimedia Signal Processing, MMSP 2022
Country/Territory: China
City: Shanghai
Period: 26/09/22 → 28/09/22

Keywords

  • differential privacy
  • knowledge distillation
  • teacher-student learning
